Face Recognition against My Photo Library

iCloud Photos on iOS and macOS provides a comprehensive way to tag faces: once a few faces are tagged, every photo in the library is scanned, and faces are identified and tagged automatically based on the earlier user-defined tags.

I am a photography enthusiast and have been taking photos since 2004. As of today that is over 200k shots, of which I have kept about 40k. Many desktop applications can do face recognition, but it will be far more fun to build the solution end to end myself.

This report has the following parts.

  1. Get photos and preprocess.
  2. Get the features of the baseline photos.
  3. Train the model to perform the classification.
  4. Evaluate the model.

Part 1 - Get photos and preprocess

A glance at the folder structure

All my photos are in D:\Pictures; the majority exist in both .jpg and .nef format. .nef is Nikon's raw image format, and the .jpg is the copy exported after post-processing the raw file.

In [1]:
import os
import time
cur_dir = os.getcwd()
print(cur_dir)
target_image_dir = os.path.join(cur_dir, 'images')
photo_dir = r'D:\Pictures'  # raw string avoids accidental escape sequences in Windows paths
os.listdir(photo_dir)
D:\Google Drive\Study\Deep Learning Developer\Projects\Project 4 - Face Recognition Against My Photo Library
Out[1]:
['.SynologyWorkingDirectory',
 '2004',
 '2005',
 '2006',
 '2007',
 '2008',
 '2009',
 '2010',
 '2011',
 '2012',
 '2013',
 '2014',
 '2015',
 '2016',
 '2017',
 'Adobe Lightroom',
 'Camera Roll',
 'desktop.ini',
 'iCloud Photos',
 'naming instruction.txt',
 'Phone Photos',
 'Photography Works',
 'Saved Pictures',
 'zbingjie',
 'zothers',
 '法蝶',
 '熊思宇和黄乐论辩论']

One sample of a recent photo directory. Every file uses its capture timestamp as the file name. .nef is the raw image file and .xmp is generated by Adobe Lightroom; neither is applicable to this project.

In [2]:
os.listdir(photo_dir + '/2017/2017.11.16 - Singapore Fintech Festival')
Out[2]:
['20171112-2038.jpg',
 '20171112-2038.NEF',
 '20171112-2038.xmp',
 '20171112-2039.jpg',
 '20171112-2039.NEF',
 '20171112-2039.xmp',
 '20171116-1705.jpg',
 '20171116-1705.NEF',
 '20171116-1705.xmp',
 '20171116-1707.jpg',
 '20171116-1707.NEF',
 '20171116-1707.xmp',
 '20171116-1708.jpg',
 '20171116-1708.NEF',
 '20171116-1708.xmp',
 '20171116-1709.jpg',
 '20171116-1709.NEF',
 '20171116-1709.xmp',
 '20171116-1710.jpg',
 '20171116-1710.NEF',
 '20171116-1710.xmp',
 '20171116-1713.jpg',
 '20171116-1713.NEF',
 '20171116-1713.xmp',
 '20171116-1715.jpg',
 '20171116-1715.NEF',
 '20171116-1715.xmp',
 '20171116-1716.jpg',
 '20171116-1716.NEF',
 '20171116-1716.xmp',
 '20171116-1717.jpg',
 '20171116-1717.NEF',
 '20171116-1717.xmp',
 '20171116-1718-2.jpg',
 '20171116-1718-2.NEF',
 '20171116-1718-2.xmp',
 '20171116-1718.jpg',
 '20171116-1718.NEF',
 '20171116-1718.xmp',
 '20171116-1720.jpg',
 '20171116-1720.NEF',
 '20171116-1720.xmp',
 '20171116-1721-2.jpg',
 '20171116-1721-2.NEF',
 '20171116-1721-2.xmp',
 '20171116-1721.jpg',
 '20171116-1721.NEF',
 '20171116-1721.xmp',
 '20171116-1722-2.jpg',
 '20171116-1722-2.NEF',
 '20171116-1722-2.xmp',
 '20171116-1722.jpg',
 '20171116-1722.NEF',
 '20171116-1722.xmp',
 '20171116-1727-2.jpg',
 '20171116-1727-2.NEF',
 '20171116-1727-2.xmp',
 '20171116-1727.jpg',
 '20171116-1727.NEF',
 '20171116-1727.xmp',
 '20171116-1729.jpg',
 '20171116-1729.NEF',
 '20171116-1729.xmp',
 '20171116-1731-4.jpg',
 '20171116-1731-4.NEF',
 '20171116-1731-4.xmp',
 '20171116-1731-5.jpg',
 '20171116-1731-5.NEF',
 '20171116-1731-5.xmp',
 '20171116-1731-6.jpg',
 '20171116-1731-6.NEF',
 '20171116-1731-6.xmp',
 '20171116-1731.jpg',
 '20171116-1731.NEF',
 '20171116-1731.xmp',
 '20171116-1732.jpg',
 '20171116-1732.NEF',
 '20171116-1732.xmp',
 '20171116-1733-2.jpg',
 '20171116-1733-2.NEF',
 '20171116-1733-2.xmp',
 '20171116-1733.jpg',
 '20171116-1733.NEF',
 '20171116-1733.xmp',
 '20171116-1734.jpg',
 '20171116-1734.NEF',
 '20171116-1734.xmp',
 '20171116-1735.jpg',
 '20171116-1735.NEF',
 '20171116-1735.xmp',
 '20171116-1737.jpg',
 '20171116-1737.NEF',
 '20171116-1737.xmp',
 '20171116-1744.jpg',
 '20171116-1744.NEF',
 '20171116-1744.xmp',
 '20171116-1745.jpg',
 '20171116-1745.NEF',
 '20171116-1745.xmp',
 '20171116-1746-2.jpg',
 '20171116-1746-2.NEF',
 '20171116-1746-2.xmp',
 '20171116-1746.jpg',
 '20171116-1746.NEF',
 '20171116-1746.xmp',
 '20171116-1747.jpg',
 '20171116-1747.NEF',
 '20171116-1747.xmp',
 '20171116-1752.jpg',
 '20171116-1752.NEF',
 '20171116-1752.xmp',
 '20171116-1817.jpg',
 '20171116-1817.NEF',
 '20171116-1817.xmp',
 '20171116-1826.jpg',
 '20171116-1826.NEF',
 '20171116-1826.xmp',
 '20171116-1839.jpg',
 '20171116-1839.NEF',
 '20171116-1839.xmp',
 '20171116-1840.jpg',
 '20171116-1840.NEF',
 '20171116-1840.xmp',
 '20171116-1841.jpg',
 '20171116-1841.NEF',
 '20171116-1841.xmp',
 '20171116-1923.jpg',
 '20171116-1923.NEF',
 '20171116-1923.xmp',
 '20171116-1926-3.jpg',
 '20171116-1926-3.NEF',
 '20171116-1926-3.xmp',
 '20171116-1926-4.jpg',
 '20171116-1926-4.NEF',
 '20171116-1926-4.xmp',
 '20171116-1926.jpg',
 '20171116-1926.NEF',
 '20171116-1926.xmp',
 '20171116-1927-5.jpg',
 '20171116-1927-5.NEF',
 '20171116-1927-5.xmp',
 '20171116-1927-6.jpg',
 '20171116-1927-6.NEF',
 '20171116-1927-6.xmp',
 '20171116-1927-7.jpg',
 '20171116-1927-7.NEF',
 '20171116-1927-7.xmp',
 '20171116-1927-8.jpg',
 '20171116-1927-8.NEF',
 '20171116-1927-8.xmp',
 '20171116-1927.jpg',
 '20171116-1927.NEF',
 '20171116-1927.xmp',
 '20171116-1928-3.jpg',
 '20171116-1928-3.NEF',
 '20171116-1928-3.xmp',
 '20171116-1928-4.jpg',
 '20171116-1928-4.NEF',
 '20171116-1928-4.xmp',
 '20171116-1928.jpg',
 '20171116-1928.NEF',
 '20171116-1928.xmp',
 '20171116-1929-3.jpg',
 '20171116-1929-3.NEF',
 '20171116-1929-3.xmp',
 '20171116-1929-4.jpg',
 '20171116-1929-4.NEF',
 '20171116-1929-4.xmp',
 '20171116-1929.jpg',
 '20171116-1929.NEF',
 '20171116-1929.xmp',
 '20171116-1930-2.jpg',
 '20171116-1930-2.NEF',
 '20171116-1930-2.xmp',
 '20171116-1930.jpg',
 '20171116-1930.NEF',
 '20171116-1930.xmp',
 '20171116-1936.jpg',
 '20171116-1936.NEF',
 '20171116-1936.xmp']
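Since each file name encodes its capture time, the timestamp can be parsed back into a datetime when needed, e.g. for sorting or grouping. A minimal sketch (the helper name parse_capture_time is mine, not from the notebook; burst shots carry a '-N' suffix, which is ignored):

```python
from datetime import datetime

def parse_capture_time(filename):
    """Parse the YYYYMMDD-HHMM timestamp that prefixes each file name.

    Burst shots such as '20171116-1718-2.jpg' have a trailing '-N'
    counter, which is dropped here.
    """
    stem = filename.rsplit('.', 1)[0]        # drop the extension
    stamp = '-'.join(stem.split('-')[:2])    # keep only 'YYYYMMDD-HHMM'
    return datetime.strptime(stamp, '%Y%m%d-%H%M')

print(parse_capture_time('20171116-1718-2.jpg'))  # 2017-11-16 17:18:00
```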

Create face images

Iterate through all the images and save each detected face as a .jpg file in the working directory. The scaling factor of the cascade classifier affects the number of false positives, so some manual clean-up will be needed afterwards.

In [120]:
# Import required libraries for this section
%matplotlib inline
import numpy as np
import matplotlib.pyplot as plt
import math
import cv2
import time
In [64]:
def show_faces(file_path, display=False, save=False, scaleFactor=1.3, minNeighb=5):
    print('Image path', file_path)
    
    # The file path may contain non-ASCII characters, so cv2.imread() cannot read it directly
    file_stream = open(file_path, 'rb')
    bytes_arr = bytearray(file_stream.read())
    numpy_ar = np.asarray(bytes_arr, dtype=np.uint8)
    image = cv2.imdecode(numpy_ar, cv2.IMREAD_UNCHANGED)
    print(image.shape)
    
    # Convert to RGB
    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    # Convert the RGB  image to grayscale
    gray = cv2.cvtColor(image, cv2.COLOR_RGB2GRAY)

    # Extract the pre-trained face detector from an xml file
    face_cascade = cv2.CascadeClassifier('detector_architectures/haarcascade_frontalface_default.xml')

    # Detect the faces in image
    faces = face_cascade.detectMultiScale(gray, scaleFactor, minNeighb)

    # Print the number of faces detected in the image
    print('Number of faces detected:', len(faces))

    # Make a copy of the original image to draw face detections on
    image_with_detections = np.copy(image)

    # The list of detected faces
    image_faces = []
    # Get the bounding box for each detected face
    for (x,y,w,h) in faces:
        # Add a red bounding box to the detections image
        if w > 200:
            line_width = w//20
        else:
            line_width = 3
        image_faces.append(image[y:(y+h), x:(x+w)])
        cv2.rectangle(image_with_detections, (x,y), (x+w,y+h), (255,0,0), line_width)
    
    if save:
        save_faces(file_path, image_faces)

    if display:
        # Display the image with the detections
        fig = plt.figure(figsize=(10, 10))
        ax = fig.add_subplot(1, 1, 1, xticks=[], yticks=[])
        ax.set_title('Sample Image')
        ax.imshow(image_with_detections)
    # Restore the working directory (save_faces may have changed it)
    os.chdir(cur_dir)
In [5]:
# pathlib available from python 3.5
from pathlib import Path
def save_faces(file_path, image_faces):
    if len(image_faces) == 0:
        return
    # Save each face into an individual file
    target_file = file_path.replace(photo_dir, target_image_dir)
    target_dir = os.path.dirname(target_file)
    target_path = Path(target_dir)
    
    # Create the directory and its parents; don't raise if it already exists
    target_path.mkdir(parents=True, exist_ok=True)
        
    for i, face in enumerate(image_faces):
        face = cv2.resize(face, (299, 299))
        os.chdir(target_dir)
        file_name = os.path.basename(target_file)
        cv2.imwrite(file_name + '-face-' + str(i) + '.jpg', cv2.cvtColor(face, cv2.COLOR_BGR2RGB))
        
In [65]:
# Load in color image for face detection
file_path = os.path.join(photo_dir, '2017\\2017.11.16 - Singapore Fintech Festival', '20171116-1923.jpg')
show_faces(file_path, True)
Image path D:\Pictures\2017\2017.11.16 - Singapore Fintech Festival\20171116-1923.jpg
(4760, 7132, 3)
Number of faces detected: 5
In [6]:
all_jpg = []
for root, dirs, files in os.walk(photo_dir):
    # All the target photos are in D:\Pictures\20xx. Get the jpgs from them only.
    path = root.split(os.sep)
    if len(path) < 3:
        continue
    else:
        year = path[2]
        if year[:2] != '20':
            continue
    #print((len(path) - 1) * '---', os.path.basename(root))
    for file in files:
        if file[-3:].lower() == 'jpg':
            #print(len(path) * '---', file)
            all_jpg.append(os.path.join(root, file))
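The filtering logic in the loop above can also be isolated into a standalone predicate, which makes it easy to test against sample paths. A sketch (the name is_target_jpg is mine; this variant is slightly stricter, matching the '.jpg' suffix with the dot, and normalises backslashes so the check also runs off-Windows):

```python
def is_target_jpg(path, photo_root='D:/Pictures'):
    """Keep .jpg files that sit under a top-level year folder (20xx)
    inside the photo root; everything else is skipped."""
    rel = path.replace('\\', '/')
    root = photo_root.replace('\\', '/').rstrip('/') + '/'
    if not rel.startswith(root):
        return False
    parts = rel[len(root):].split('/')
    return (len(parts) > 1
            and parts[0].startswith('20')
            and parts[-1].lower().endswith('.jpg'))
```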
In [7]:
print('Number of jpgs:', len(all_jpg))
all_jpg[:10]
Number of jpgs: 39687
Out[7]:
['D:\\Pictures\\2004\\2004.12.16 - 保存的第一张数码照片\\20041216-2037.JPG',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-0828.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-0829.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-1004.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-1011-2.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-1011.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-1013.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-1203.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-1219.jpg',
 'D:\\Pictures\\2004\\2004.12.21~24 - 刚到新加坡,bukit timah hill, befriender, 猴子, oldham hall及附近\\20041221-1225.jpg']
In [ ]:
#############################
### RUN WITH CAUTION ########
#############################

# Scan through all 40k photos and extract faces
for i in range(len(all_jpg)):
    show_faces(all_jpg[i], False, True)

Manually label the face images by putting them into different folders

About 100k faces were identified, most of them not relevant. I hand-picked over 200 of them.

In [8]:
os.listdir('./images')
Out[8]:
['Test', 'Train', 'Validate']
In [9]:
os.listdir('./images/Test')
Out[9]:
['Brother', 'Dad', 'Daughter', 'Me', 'Mum', 'Son', 'Wife']
In [10]:
from sklearn.datasets import load_files
from keras.utils import np_utils
from glob import glob

# Read all the files and return 2 numpy arrays: one with the file paths
# and one with the one-hot encoding of the category.
def load_dataset(path):
    data = load_files(path)
    face_files = np.array(data['filenames'])
    face_targets = np_utils.to_categorical(np.array(data['target']), 7)
    return face_files, face_targets
Using TensorFlow backend.
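np_utils.to_categorical simply maps integer class ids to one-hot rows; a plain-numpy equivalent for the 7 classes here (the helper name to_one_hot is mine; sklearn's load_files assigns class ids from the sorted folder names):

```python
import numpy as np

def to_one_hot(targets, num_classes=7):
    """Plain-numpy equivalent of keras' np_utils.to_categorical:
    map integer class ids to one-hot encoded rows."""
    one_hot = np.zeros((len(targets), num_classes))
    one_hot[np.arange(len(targets)), targets] = 1.0
    return one_hot

# Class ids 0..6 correspond to the 7 folder names in sorted order
print(to_one_hot([0, 3, 6]))
```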

Part 2 - Transfer learning and model training

In [36]:
from keras.applications.inception_v3 import InceptionV3
from keras.models import Model
from keras.layers import Dense, GlobalAveragePooling2D
from keras import backend as K
from keras.applications.imagenet_utils import preprocess_input, decode_predictions
from keras.callbacks import ModelCheckpoint, EarlyStopping, LambdaCallback, ReduceLROnPlateau
from keras.models import load_model
In [12]:
base_model = InceptionV3(weights='imagenet', include_top=False)
base_model.summary()
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, None, None, 3) 0                                            
____________________________________________________________________________________________________
conv2d_1 (Conv2D)                (None, None, None, 32 864         input_1[0][0]                    
____________________________________________________________________________________________________
batch_normalization_1 (BatchNorm (None, None, None, 32 96          conv2d_1[0][0]                   
____________________________________________________________________________________________________
activation_1 (Activation)        (None, None, None, 32 0           batch_normalization_1[0][0]      
____________________________________________________________________________________________________
conv2d_2 (Conv2D)                (None, None, None, 32 9216        activation_1[0][0]               
____________________________________________________________________________________________________
batch_normalization_2 (BatchNorm (None, None, None, 32 96          conv2d_2[0][0]                   
____________________________________________________________________________________________________
activation_2 (Activation)        (None, None, None, 32 0           batch_normalization_2[0][0]      
____________________________________________________________________________________________________
conv2d_3 (Conv2D)                (None, None, None, 64 18432       activation_2[0][0]               
____________________________________________________________________________________________________
batch_normalization_3 (BatchNorm (None, None, None, 64 192         conv2d_3[0][0]                   
____________________________________________________________________________________________________
activation_3 (Activation)        (None, None, None, 64 0           batch_normalization_3[0][0]      
____________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)   (None, None, None, 64 0           activation_3[0][0]               
____________________________________________________________________________________________________
conv2d_4 (Conv2D)                (None, None, None, 80 5120        max_pooling2d_1[0][0]            
____________________________________________________________________________________________________
batch_normalization_4 (BatchNorm (None, None, None, 80 240         conv2d_4[0][0]                   
____________________________________________________________________________________________________
activation_4 (Activation)        (None, None, None, 80 0           batch_normalization_4[0][0]      
____________________________________________________________________________________________________
conv2d_5 (Conv2D)                (None, None, None, 19 138240      activation_4[0][0]               
____________________________________________________________________________________________________
batch_normalization_5 (BatchNorm (None, None, None, 19 576         conv2d_5[0][0]                   
____________________________________________________________________________________________________
activation_5 (Activation)        (None, None, None, 19 0           batch_normalization_5[0][0]      
____________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)   (None, None, None, 19 0           activation_5[0][0]               
____________________________________________________________________________________________________
conv2d_9 (Conv2D)                (None, None, None, 64 12288       max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
batch_normalization_9 (BatchNorm (None, None, None, 64 192         conv2d_9[0][0]                   
____________________________________________________________________________________________________
activation_9 (Activation)        (None, None, None, 64 0           batch_normalization_9[0][0]      
____________________________________________________________________________________________________
conv2d_7 (Conv2D)                (None, None, None, 48 9216        max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_10 (Conv2D)               (None, None, None, 96 55296       activation_9[0][0]               
____________________________________________________________________________________________________
batch_normalization_7 (BatchNorm (None, None, None, 48 144         conv2d_7[0][0]                   
____________________________________________________________________________________________________
batch_normalization_10 (BatchNor (None, None, None, 96 288         conv2d_10[0][0]                  
____________________________________________________________________________________________________
activation_7 (Activation)        (None, None, None, 48 0           batch_normalization_7[0][0]      
____________________________________________________________________________________________________
activation_10 (Activation)       (None, None, None, 96 0           batch_normalization_10[0][0]     
____________________________________________________________________________________________________
average_pooling2d_1 (AveragePool (None, None, None, 19 0           max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_6 (Conv2D)                (None, None, None, 64 12288       max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_8 (Conv2D)                (None, None, None, 64 76800       activation_7[0][0]               
____________________________________________________________________________________________________
conv2d_11 (Conv2D)               (None, None, None, 96 82944       activation_10[0][0]              
____________________________________________________________________________________________________
conv2d_12 (Conv2D)               (None, None, None, 32 6144        average_pooling2d_1[0][0]        
____________________________________________________________________________________________________
batch_normalization_6 (BatchNorm (None, None, None, 64 192         conv2d_6[0][0]                   
____________________________________________________________________________________________________
batch_normalization_8 (BatchNorm (None, None, None, 64 192         conv2d_8[0][0]                   
____________________________________________________________________________________________________
batch_normalization_11 (BatchNor (None, None, None, 96 288         conv2d_11[0][0]                  
____________________________________________________________________________________________________
batch_normalization_12 (BatchNor (None, None, None, 32 96          conv2d_12[0][0]                  
____________________________________________________________________________________________________
activation_6 (Activation)        (None, None, None, 64 0           batch_normalization_6[0][0]      
____________________________________________________________________________________________________
activation_8 (Activation)        (None, None, None, 64 0           batch_normalization_8[0][0]      
____________________________________________________________________________________________________
activation_11 (Activation)       (None, None, None, 96 0           batch_normalization_11[0][0]     
____________________________________________________________________________________________________
activation_12 (Activation)       (None, None, None, 32 0           batch_normalization_12[0][0]     
____________________________________________________________________________________________________
mixed0 (Concatenate)             (None, None, None, 25 0           activation_6[0][0]               
                                                                   activation_8[0][0]               
                                                                   activation_11[0][0]              
                                                                   activation_12[0][0]              
____________________________________________________________________________________________________
conv2d_16 (Conv2D)               (None, None, None, 64 16384       mixed0[0][0]                     
____________________________________________________________________________________________________
batch_normalization_16 (BatchNor (None, None, None, 64 192         conv2d_16[0][0]                  
____________________________________________________________________________________________________
activation_16 (Activation)       (None, None, None, 64 0           batch_normalization_16[0][0]     
____________________________________________________________________________________________________
conv2d_14 (Conv2D)               (None, None, None, 48 12288       mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_17 (Conv2D)               (None, None, None, 96 55296       activation_16[0][0]              
____________________________________________________________________________________________________
batch_normalization_14 (BatchNor (None, None, None, 48 144         conv2d_14[0][0]                  
____________________________________________________________________________________________________
batch_normalization_17 (BatchNor (None, None, None, 96 288         conv2d_17[0][0]                  
____________________________________________________________________________________________________
activation_14 (Activation)       (None, None, None, 48 0           batch_normalization_14[0][0]     
____________________________________________________________________________________________________
activation_17 (Activation)       (None, None, None, 96 0           batch_normalization_17[0][0]     
____________________________________________________________________________________________________
average_pooling2d_2 (AveragePool (None, None, None, 25 0           mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_13 (Conv2D)               (None, None, None, 64 16384       mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_15 (Conv2D)               (None, None, None, 64 76800       activation_14[0][0]              
____________________________________________________________________________________________________
conv2d_18 (Conv2D)               (None, None, None, 96 82944       activation_17[0][0]              
____________________________________________________________________________________________________
conv2d_19 (Conv2D)               (None, None, None, 64 16384       average_pooling2d_2[0][0]        
____________________________________________________________________________________________________
batch_normalization_13 (BatchNor (None, None, None, 64 192         conv2d_13[0][0]                  
____________________________________________________________________________________________________
batch_normalization_15 (BatchNor (None, None, None, 64 192         conv2d_15[0][0]                  
____________________________________________________________________________________________________
batch_normalization_18 (BatchNor (None, None, None, 96 288         conv2d_18[0][0]                  
____________________________________________________________________________________________________
batch_normalization_19 (BatchNor (None, None, None, 64 192         conv2d_19[0][0]                  
____________________________________________________________________________________________________
activation_13 (Activation)       (None, None, None, 64 0           batch_normalization_13[0][0]     
____________________________________________________________________________________________________
activation_15 (Activation)       (None, None, None, 64 0           batch_normalization_15[0][0]     
____________________________________________________________________________________________________
activation_18 (Activation)       (None, None, None, 96 0           batch_normalization_18[0][0]     
____________________________________________________________________________________________________
activation_19 (Activation)       (None, None, None, 64 0           batch_normalization_19[0][0]     
____________________________________________________________________________________________________
mixed1 (Concatenate)             (None, None, None, 28 0           activation_13[0][0]              
                                                                   activation_15[0][0]              
                                                                   activation_18[0][0]              
                                                                   activation_19[0][0]              
____________________________________________________________________________________________________
conv2d_23 (Conv2D)               (None, None, None, 64 18432       mixed1[0][0]                     
____________________________________________________________________________________________________
batch_normalization_23 (BatchNor (None, None, None, 64 192         conv2d_23[0][0]                  
____________________________________________________________________________________________________
activation_23 (Activation)       (None, None, None, 64 0           batch_normalization_23[0][0]     
____________________________________________________________________________________________________
conv2d_21 (Conv2D)               (None, None, None, 48 13824       mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_24 (Conv2D)               (None, None, None, 96 55296       activation_23[0][0]              
____________________________________________________________________________________________________
batch_normalization_21 (BatchNor (None, None, None, 48 144         conv2d_21[0][0]                  
____________________________________________________________________________________________________
batch_normalization_24 (BatchNor (None, None, None, 96 288         conv2d_24[0][0]                  
____________________________________________________________________________________________________
activation_21 (Activation)       (None, None, None, 48 0           batch_normalization_21[0][0]     
____________________________________________________________________________________________________
activation_24 (Activation)       (None, None, None, 96 0           batch_normalization_24[0][0]     
____________________________________________________________________________________________________
average_pooling2d_3 (AveragePool (None, None, None, 28 0           mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_20 (Conv2D)               (None, None, None, 64 18432       mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_22 (Conv2D)               (None, None, None, 64 76800       activation_21[0][0]              
____________________________________________________________________________________________________
conv2d_25 (Conv2D)               (None, None, None, 96 82944       activation_24[0][0]              
____________________________________________________________________________________________________
conv2d_26 (Conv2D)               (None, None, None, 64 18432       average_pooling2d_3[0][0]        
____________________________________________________________________________________________________
batch_normalization_20 (BatchNor (None, None, None, 64 192         conv2d_20[0][0]                  
____________________________________________________________________________________________________
batch_normalization_22 (BatchNor (None, None, None, 64 192         conv2d_22[0][0]                  
____________________________________________________________________________________________________
batch_normalization_25 (BatchNor (None, None, None, 96 288         conv2d_25[0][0]                  
____________________________________________________________________________________________________
batch_normalization_26 (BatchNor (None, None, None, 64 192         conv2d_26[0][0]                  
____________________________________________________________________________________________________
activation_20 (Activation)       (None, None, None, 64 0           batch_normalization_20[0][0]     
____________________________________________________________________________________________________
activation_22 (Activation)       (None, None, None, 64 0           batch_normalization_22[0][0]     
____________________________________________________________________________________________________
activation_25 (Activation)       (None, None, None, 96 0           batch_normalization_25[0][0]     
____________________________________________________________________________________________________
activation_26 (Activation)       (None, None, None, 64 0           batch_normalization_26[0][0]     
____________________________________________________________________________________________________
mixed2 (Concatenate)             (None, None, None, 28 0           activation_20[0][0]              
                                                                   activation_22[0][0]              
                                                                   activation_25[0][0]              
                                                                   activation_26[0][0]              
____________________________________________________________________________________________________
conv2d_28 (Conv2D)               (None, None, None, 64 18432       mixed2[0][0]                     
____________________________________________________________________________________________________
batch_normalization_28 (BatchNor (None, None, None, 64 192         conv2d_28[0][0]                  
____________________________________________________________________________________________________
activation_28 (Activation)       (None, None, None, 64 0           batch_normalization_28[0][0]     
____________________________________________________________________________________________________
conv2d_29 (Conv2D)               (None, None, None, 96 55296       activation_28[0][0]              
____________________________________________________________________________________________________
batch_normalization_29 (BatchNor (None, None, None, 96 288         conv2d_29[0][0]                  
____________________________________________________________________________________________________
activation_29 (Activation)       (None, None, None, 96 0           batch_normalization_29[0][0]     
____________________________________________________________________________________________________
conv2d_27 (Conv2D)               (None, None, None, 38 995328      mixed2[0][0]                     
____________________________________________________________________________________________________
conv2d_30 (Conv2D)               (None, None, None, 96 82944       activation_29[0][0]              
____________________________________________________________________________________________________
batch_normalization_27 (BatchNor (None, None, None, 38 1152        conv2d_27[0][0]                  
____________________________________________________________________________________________________
batch_normalization_30 (BatchNor (None, None, None, 96 288         conv2d_30[0][0]                  
____________________________________________________________________________________________________
activation_27 (Activation)       (None, None, None, 38 0           batch_normalization_27[0][0]     
____________________________________________________________________________________________________
activation_30 (Activation)       (None, None, None, 96 0           batch_normalization_30[0][0]     
____________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D)   (None, None, None, 28 0           mixed2[0][0]                     
____________________________________________________________________________________________________
mixed3 (Concatenate)             (None, None, None, 76 0           activation_27[0][0]              
                                                                   activation_30[0][0]              
                                                                   max_pooling2d_3[0][0]            
____________________________________________________________________________________________________
conv2d_35 (Conv2D)               (None, None, None, 12 98304       mixed3[0][0]                     
____________________________________________________________________________________________________
batch_normalization_35 (BatchNor (None, None, None, 12 384         conv2d_35[0][0]                  
____________________________________________________________________________________________________
activation_35 (Activation)       (None, None, None, 12 0           batch_normalization_35[0][0]     
____________________________________________________________________________________________________
conv2d_36 (Conv2D)               (None, None, None, 12 114688      activation_35[0][0]              
____________________________________________________________________________________________________
batch_normalization_36 (BatchNor (None, None, None, 12 384         conv2d_36[0][0]                  
____________________________________________________________________________________________________
activation_36 (Activation)       (None, None, None, 12 0           batch_normalization_36[0][0]     
____________________________________________________________________________________________________
conv2d_32 (Conv2D)               (None, None, None, 12 98304       mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_37 (Conv2D)               (None, None, None, 12 114688      activation_36[0][0]              
____________________________________________________________________________________________________
batch_normalization_32 (BatchNor (None, None, None, 12 384         conv2d_32[0][0]                  
____________________________________________________________________________________________________
batch_normalization_37 (BatchNor (None, None, None, 12 384         conv2d_37[0][0]                  
____________________________________________________________________________________________________
activation_32 (Activation)       (None, None, None, 12 0           batch_normalization_32[0][0]     
____________________________________________________________________________________________________
activation_37 (Activation)       (None, None, None, 12 0           batch_normalization_37[0][0]     
____________________________________________________________________________________________________
conv2d_33 (Conv2D)               (None, None, None, 12 114688      activation_32[0][0]              
____________________________________________________________________________________________________
conv2d_38 (Conv2D)               (None, None, None, 12 114688      activation_37[0][0]              
____________________________________________________________________________________________________
batch_normalization_33 (BatchNor (None, None, None, 12 384         conv2d_33[0][0]                  
____________________________________________________________________________________________________
batch_normalization_38 (BatchNor (None, None, None, 12 384         conv2d_38[0][0]                  
____________________________________________________________________________________________________
activation_33 (Activation)       (None, None, None, 12 0           batch_normalization_33[0][0]     
____________________________________________________________________________________________________
activation_38 (Activation)       (None, None, None, 12 0           batch_normalization_38[0][0]     
____________________________________________________________________________________________________
average_pooling2d_4 (AveragePool (None, None, None, 76 0           mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_31 (Conv2D)               (None, None, None, 19 147456      mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_34 (Conv2D)               (None, None, None, 19 172032      activation_33[0][0]              
____________________________________________________________________________________________________
conv2d_39 (Conv2D)               (None, None, None, 19 172032      activation_38[0][0]              
____________________________________________________________________________________________________
conv2d_40 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_4[0][0]        
____________________________________________________________________________________________________
batch_normalization_31 (BatchNor (None, None, None, 19 576         conv2d_31[0][0]                  
____________________________________________________________________________________________________
batch_normalization_34 (BatchNor (None, None, None, 19 576         conv2d_34[0][0]                  
____________________________________________________________________________________________________
batch_normalization_39 (BatchNor (None, None, None, 19 576         conv2d_39[0][0]                  
____________________________________________________________________________________________________
batch_normalization_40 (BatchNor (None, None, None, 19 576         conv2d_40[0][0]                  
____________________________________________________________________________________________________
activation_31 (Activation)       (None, None, None, 19 0           batch_normalization_31[0][0]     
____________________________________________________________________________________________________
activation_34 (Activation)       (None, None, None, 19 0           batch_normalization_34[0][0]     
____________________________________________________________________________________________________
activation_39 (Activation)       (None, None, None, 19 0           batch_normalization_39[0][0]     
____________________________________________________________________________________________________
activation_40 (Activation)       (None, None, None, 19 0           batch_normalization_40[0][0]     
____________________________________________________________________________________________________
mixed4 (Concatenate)             (None, None, None, 76 0           activation_31[0][0]              
                                                                   activation_34[0][0]              
                                                                   activation_39[0][0]              
                                                                   activation_40[0][0]              
____________________________________________________________________________________________________
conv2d_45 (Conv2D)               (None, None, None, 16 122880      mixed4[0][0]                     
____________________________________________________________________________________________________
batch_normalization_45 (BatchNor (None, None, None, 16 480         conv2d_45[0][0]                  
____________________________________________________________________________________________________
activation_45 (Activation)       (None, None, None, 16 0           batch_normalization_45[0][0]     
____________________________________________________________________________________________________
conv2d_46 (Conv2D)               (None, None, None, 16 179200      activation_45[0][0]              
____________________________________________________________________________________________________
batch_normalization_46 (BatchNor (None, None, None, 16 480         conv2d_46[0][0]                  
____________________________________________________________________________________________________
activation_46 (Activation)       (None, None, None, 16 0           batch_normalization_46[0][0]     
____________________________________________________________________________________________________
conv2d_42 (Conv2D)               (None, None, None, 16 122880      mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_47 (Conv2D)               (None, None, None, 16 179200      activation_46[0][0]              
____________________________________________________________________________________________________
batch_normalization_42 (BatchNor (None, None, None, 16 480         conv2d_42[0][0]                  
____________________________________________________________________________________________________
batch_normalization_47 (BatchNor (None, None, None, 16 480         conv2d_47[0][0]                  
____________________________________________________________________________________________________
activation_42 (Activation)       (None, None, None, 16 0           batch_normalization_42[0][0]     
____________________________________________________________________________________________________
activation_47 (Activation)       (None, None, None, 16 0           batch_normalization_47[0][0]     
____________________________________________________________________________________________________
conv2d_43 (Conv2D)               (None, None, None, 16 179200      activation_42[0][0]              
____________________________________________________________________________________________________
conv2d_48 (Conv2D)               (None, None, None, 16 179200      activation_47[0][0]              
____________________________________________________________________________________________________
batch_normalization_43 (BatchNor (None, None, None, 16 480         conv2d_43[0][0]                  
____________________________________________________________________________________________________
batch_normalization_48 (BatchNor (None, None, None, 16 480         conv2d_48[0][0]                  
____________________________________________________________________________________________________
activation_43 (Activation)       (None, None, None, 16 0           batch_normalization_43[0][0]     
____________________________________________________________________________________________________
activation_48 (Activation)       (None, None, None, 16 0           batch_normalization_48[0][0]     
____________________________________________________________________________________________________
average_pooling2d_5 (AveragePool (None, None, None, 76 0           mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_41 (Conv2D)               (None, None, None, 19 147456      mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_44 (Conv2D)               (None, None, None, 19 215040      activation_43[0][0]              
____________________________________________________________________________________________________
conv2d_49 (Conv2D)               (None, None, None, 19 215040      activation_48[0][0]              
____________________________________________________________________________________________________
conv2d_50 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_5[0][0]        
____________________________________________________________________________________________________
batch_normalization_41 (BatchNor (None, None, None, 19 576         conv2d_41[0][0]                  
____________________________________________________________________________________________________
batch_normalization_44 (BatchNor (None, None, None, 19 576         conv2d_44[0][0]                  
____________________________________________________________________________________________________
batch_normalization_49 (BatchNor (None, None, None, 19 576         conv2d_49[0][0]                  
____________________________________________________________________________________________________
batch_normalization_50 (BatchNor (None, None, None, 19 576         conv2d_50[0][0]                  
____________________________________________________________________________________________________
activation_41 (Activation)       (None, None, None, 19 0           batch_normalization_41[0][0]     
____________________________________________________________________________________________________
activation_44 (Activation)       (None, None, None, 19 0           batch_normalization_44[0][0]     
____________________________________________________________________________________________________
activation_49 (Activation)       (None, None, None, 19 0           batch_normalization_49[0][0]     
____________________________________________________________________________________________________
activation_50 (Activation)       (None, None, None, 19 0           batch_normalization_50[0][0]     
____________________________________________________________________________________________________
mixed5 (Concatenate)             (None, None, None, 76 0           activation_41[0][0]              
                                                                   activation_44[0][0]              
                                                                   activation_49[0][0]              
                                                                   activation_50[0][0]              
____________________________________________________________________________________________________
conv2d_55 (Conv2D)               (None, None, None, 16 122880      mixed5[0][0]                     
____________________________________________________________________________________________________
batch_normalization_55 (BatchNor (None, None, None, 16 480         conv2d_55[0][0]                  
____________________________________________________________________________________________________
activation_55 (Activation)       (None, None, None, 16 0           batch_normalization_55[0][0]     
____________________________________________________________________________________________________
conv2d_56 (Conv2D)               (None, None, None, 16 179200      activation_55[0][0]              
____________________________________________________________________________________________________
batch_normalization_56 (BatchNor (None, None, None, 16 480         conv2d_56[0][0]                  
____________________________________________________________________________________________________
activation_56 (Activation)       (None, None, None, 16 0           batch_normalization_56[0][0]     
____________________________________________________________________________________________________
conv2d_52 (Conv2D)               (None, None, None, 16 122880      mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_57 (Conv2D)               (None, None, None, 16 179200      activation_56[0][0]              
____________________________________________________________________________________________________
batch_normalization_52 (BatchNor (None, None, None, 16 480         conv2d_52[0][0]                  
____________________________________________________________________________________________________
batch_normalization_57 (BatchNor (None, None, None, 16 480         conv2d_57[0][0]                  
____________________________________________________________________________________________________
activation_52 (Activation)       (None, None, None, 16 0           batch_normalization_52[0][0]     
____________________________________________________________________________________________________
activation_57 (Activation)       (None, None, None, 16 0           batch_normalization_57[0][0]     
____________________________________________________________________________________________________
conv2d_53 (Conv2D)               (None, None, None, 16 179200      activation_52[0][0]              
____________________________________________________________________________________________________
conv2d_58 (Conv2D)               (None, None, None, 16 179200      activation_57[0][0]              
____________________________________________________________________________________________________
batch_normalization_53 (BatchNor (None, None, None, 16 480         conv2d_53[0][0]                  
____________________________________________________________________________________________________
batch_normalization_58 (BatchNor (None, None, None, 16 480         conv2d_58[0][0]                  
____________________________________________________________________________________________________
activation_53 (Activation)       (None, None, None, 16 0           batch_normalization_53[0][0]     
____________________________________________________________________________________________________
activation_58 (Activation)       (None, None, None, 16 0           batch_normalization_58[0][0]     
____________________________________________________________________________________________________
average_pooling2d_6 (AveragePool (None, None, None, 76 0           mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_51 (Conv2D)               (None, None, None, 19 147456      mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_54 (Conv2D)               (None, None, None, 19 215040      activation_53[0][0]              
____________________________________________________________________________________________________
conv2d_59 (Conv2D)               (None, None, None, 19 215040      activation_58[0][0]              
____________________________________________________________________________________________________
conv2d_60 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_6[0][0]        
____________________________________________________________________________________________________
batch_normalization_51 (BatchNor (None, None, None, 19 576         conv2d_51[0][0]                  
____________________________________________________________________________________________________
batch_normalization_54 (BatchNor (None, None, None, 19 576         conv2d_54[0][0]                  
____________________________________________________________________________________________________
batch_normalization_59 (BatchNor (None, None, None, 19 576         conv2d_59[0][0]                  
____________________________________________________________________________________________________
batch_normalization_60 (BatchNor (None, None, None, 19 576         conv2d_60[0][0]                  
____________________________________________________________________________________________________
activation_51 (Activation)       (None, None, None, 19 0           batch_normalization_51[0][0]     
____________________________________________________________________________________________________
activation_54 (Activation)       (None, None, None, 19 0           batch_normalization_54[0][0]     
____________________________________________________________________________________________________
activation_59 (Activation)       (None, None, None, 19 0           batch_normalization_59[0][0]     
____________________________________________________________________________________________________
activation_60 (Activation)       (None, None, None, 19 0           batch_normalization_60[0][0]     
____________________________________________________________________________________________________
mixed6 (Concatenate)             (None, None, None, 76 0           activation_51[0][0]              
                                                                   activation_54[0][0]              
                                                                   activation_59[0][0]              
                                                                   activation_60[0][0]              
____________________________________________________________________________________________________
conv2d_65 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
batch_normalization_65 (BatchNor (None, None, None, 19 576         conv2d_65[0][0]                  
____________________________________________________________________________________________________
activation_65 (Activation)       (None, None, None, 19 0           batch_normalization_65[0][0]     
____________________________________________________________________________________________________
conv2d_66 (Conv2D)               (None, None, None, 19 258048      activation_65[0][0]              
____________________________________________________________________________________________________
batch_normalization_66 (BatchNor (None, None, None, 19 576         conv2d_66[0][0]                  
____________________________________________________________________________________________________
activation_66 (Activation)       (None, None, None, 19 0           batch_normalization_66[0][0]     
____________________________________________________________________________________________________
conv2d_62 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_67 (Conv2D)               (None, None, None, 19 258048      activation_66[0][0]              
____________________________________________________________________________________________________
batch_normalization_62 (BatchNor (None, None, None, 19 576         conv2d_62[0][0]                  
____________________________________________________________________________________________________
batch_normalization_67 (BatchNor (None, None, None, 19 576         conv2d_67[0][0]                  
____________________________________________________________________________________________________
activation_62 (Activation)       (None, None, None, 19 0           batch_normalization_62[0][0]     
____________________________________________________________________________________________________
activation_67 (Activation)       (None, None, None, 19 0           batch_normalization_67[0][0]     
____________________________________________________________________________________________________
conv2d_63 (Conv2D)               (None, None, None, 19 258048      activation_62[0][0]              
____________________________________________________________________________________________________
conv2d_68 (Conv2D)               (None, None, None, 19 258048      activation_67[0][0]              
____________________________________________________________________________________________________
batch_normalization_63 (BatchNor (None, None, None, 19 576         conv2d_63[0][0]                  
____________________________________________________________________________________________________
batch_normalization_68 (BatchNor (None, None, None, 19 576         conv2d_68[0][0]                  
____________________________________________________________________________________________________
activation_63 (Activation)       (None, None, None, 19 0           batch_normalization_63[0][0]     
____________________________________________________________________________________________________
activation_68 (Activation)       (None, None, None, 19 0           batch_normalization_68[0][0]     
____________________________________________________________________________________________________
average_pooling2d_7 (AveragePool (None, None, None, 76 0           mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_61 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_64 (Conv2D)               (None, None, None, 19 258048      activation_63[0][0]              
____________________________________________________________________________________________________
conv2d_69 (Conv2D)               (None, None, None, 19 258048      activation_68[0][0]              
____________________________________________________________________________________________________
conv2d_70 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_7[0][0]        
____________________________________________________________________________________________________
batch_normalization_61 (BatchNor (None, None, None, 19 576         conv2d_61[0][0]                  
____________________________________________________________________________________________________
batch_normalization_64 (BatchNor (None, None, None, 19 576         conv2d_64[0][0]                  
____________________________________________________________________________________________________
batch_normalization_69 (BatchNor (None, None, None, 19 576         conv2d_69[0][0]                  
____________________________________________________________________________________________________
batch_normalization_70 (BatchNor (None, None, None, 19 576         conv2d_70[0][0]                  
____________________________________________________________________________________________________
activation_61 (Activation)       (None, None, None, 19 0           batch_normalization_61[0][0]     
____________________________________________________________________________________________________
activation_64 (Activation)       (None, None, None, 19 0           batch_normalization_64[0][0]     
____________________________________________________________________________________________________
activation_69 (Activation)       (None, None, None, 19 0           batch_normalization_69[0][0]     
____________________________________________________________________________________________________
activation_70 (Activation)       (None, None, None, 19 0           batch_normalization_70[0][0]     
____________________________________________________________________________________________________
mixed7 (Concatenate)             (None, None, None, 76 0           activation_61[0][0]              
                                                                   activation_64[0][0]              
                                                                   activation_69[0][0]              
                                                                   activation_70[0][0]              
____________________________________________________________________________________________________
conv2d_73 (Conv2D)               (None, None, None, 19 147456      mixed7[0][0]                     
____________________________________________________________________________________________________
batch_normalization_73 (BatchNor (None, None, None, 19 576         conv2d_73[0][0]                  
____________________________________________________________________________________________________
activation_73 (Activation)       (None, None, None, 19 0           batch_normalization_73[0][0]     
____________________________________________________________________________________________________
conv2d_74 (Conv2D)               (None, None, None, 19 258048      activation_73[0][0]              
____________________________________________________________________________________________________
batch_normalization_74 (BatchNor (None, None, None, 19 576         conv2d_74[0][0]                  
____________________________________________________________________________________________________
activation_74 (Activation)       (None, None, None, 19 0           batch_normalization_74[0][0]     
____________________________________________________________________________________________________
conv2d_71 (Conv2D)               (None, None, None, 19 147456      mixed7[0][0]                     
____________________________________________________________________________________________________
conv2d_75 (Conv2D)               (None, None, None, 19 258048      activation_74[0][0]              
____________________________________________________________________________________________________
batch_normalization_71 (BatchNor (None, None, None, 19 576         conv2d_71[0][0]                  
____________________________________________________________________________________________________
batch_normalization_75 (BatchNor (None, None, None, 19 576         conv2d_75[0][0]                  
____________________________________________________________________________________________________
activation_71 (Activation)       (None, None, None, 19 0           batch_normalization_71[0][0]     
____________________________________________________________________________________________________
activation_75 (Activation)       (None, None, None, 19 0           batch_normalization_75[0][0]     
____________________________________________________________________________________________________
conv2d_72 (Conv2D)               (None, None, None, 32 552960      activation_71[0][0]              
____________________________________________________________________________________________________
conv2d_76 (Conv2D)               (None, None, None, 19 331776      activation_75[0][0]              
____________________________________________________________________________________________________
batch_normalization_72 (BatchNor (None, None, None, 32 960         conv2d_72[0][0]                  
____________________________________________________________________________________________________
batch_normalization_76 (BatchNor (None, None, None, 19 576         conv2d_76[0][0]                  
____________________________________________________________________________________________________
activation_72 (Activation)       (None, None, None, 32 0           batch_normalization_72[0][0]     
____________________________________________________________________________________________________
activation_76 (Activation)       (None, None, None, 19 0           batch_normalization_76[0][0]     
____________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D)   (None, None, None, 76 0           mixed7[0][0]                     
____________________________________________________________________________________________________
mixed8 (Concatenate)             (None, None, None, 12 0           activation_72[0][0]              
                                                                   activation_76[0][0]              
                                                                   max_pooling2d_4[0][0]            
____________________________________________________________________________________________________
conv2d_81 (Conv2D)               (None, None, None, 44 573440      mixed8[0][0]                     
____________________________________________________________________________________________________
batch_normalization_81 (BatchNor (None, None, None, 44 1344        conv2d_81[0][0]                  
____________________________________________________________________________________________________
activation_81 (Activation)       (None, None, None, 44 0           batch_normalization_81[0][0]     
____________________________________________________________________________________________________
conv2d_78 (Conv2D)               (None, None, None, 38 491520      mixed8[0][0]                     
____________________________________________________________________________________________________
conv2d_82 (Conv2D)               (None, None, None, 38 1548288     activation_81[0][0]              
____________________________________________________________________________________________________
batch_normalization_78 (BatchNor (None, None, None, 38 1152        conv2d_78[0][0]                  
____________________________________________________________________________________________________
batch_normalization_82 (BatchNor (None, None, None, 38 1152        conv2d_82[0][0]                  
____________________________________________________________________________________________________
activation_78 (Activation)       (None, None, None, 38 0           batch_normalization_78[0][0]     
____________________________________________________________________________________________________
activation_82 (Activation)       (None, None, None, 38 0           batch_normalization_82[0][0]     
____________________________________________________________________________________________________
conv2d_79 (Conv2D)               (None, None, None, 38 442368      activation_78[0][0]              
____________________________________________________________________________________________________
conv2d_80 (Conv2D)               (None, None, None, 38 442368      activation_78[0][0]              
____________________________________________________________________________________________________
conv2d_83 (Conv2D)               (None, None, None, 38 442368      activation_82[0][0]              
____________________________________________________________________________________________________
conv2d_84 (Conv2D)               (None, None, None, 38 442368      activation_82[0][0]              
____________________________________________________________________________________________________
average_pooling2d_8 (AveragePool (None, None, None, 12 0           mixed8[0][0]                     
____________________________________________________________________________________________________
conv2d_77 (Conv2D)               (None, None, None, 32 409600      mixed8[0][0]                     
____________________________________________________________________________________________________
batch_normalization_79 (BatchNor (None, None, None, 38 1152        conv2d_79[0][0]                  
____________________________________________________________________________________________________
batch_normalization_80 (BatchNor (None, None, None, 38 1152        conv2d_80[0][0]                  
____________________________________________________________________________________________________
batch_normalization_83 (BatchNor (None, None, None, 38 1152        conv2d_83[0][0]                  
____________________________________________________________________________________________________
batch_normalization_84 (BatchNor (None, None, None, 38 1152        conv2d_84[0][0]                  
____________________________________________________________________________________________________
conv2d_85 (Conv2D)               (None, None, None, 19 245760      average_pooling2d_8[0][0]        
____________________________________________________________________________________________________
batch_normalization_77 (BatchNor (None, None, None, 32 960         conv2d_77[0][0]                  
____________________________________________________________________________________________________
activation_79 (Activation)       (None, None, None, 38 0           batch_normalization_79[0][0]     
____________________________________________________________________________________________________
activation_80 (Activation)       (None, None, None, 38 0           batch_normalization_80[0][0]     
____________________________________________________________________________________________________
activation_83 (Activation)       (None, None, None, 38 0           batch_normalization_83[0][0]     
____________________________________________________________________________________________________
activation_84 (Activation)       (None, None, None, 38 0           batch_normalization_84[0][0]     
____________________________________________________________________________________________________
batch_normalization_85 (BatchNor (None, None, None, 19 576         conv2d_85[0][0]                  
____________________________________________________________________________________________________
activation_77 (Activation)       (None, None, None, 32 0           batch_normalization_77[0][0]     
____________________________________________________________________________________________________
mixed9_0 (Concatenate)           (None, None, None, 76 0           activation_79[0][0]              
                                                                   activation_80[0][0]              
____________________________________________________________________________________________________
concatenate_1 (Concatenate)      (None, None, None, 76 0           activation_83[0][0]              
                                                                   activation_84[0][0]              
____________________________________________________________________________________________________
activation_85 (Activation)       (None, None, None, 19 0           batch_normalization_85[0][0]     
____________________________________________________________________________________________________
mixed9 (Concatenate)             (None, None, None, 20 0           activation_77[0][0]              
                                                                   mixed9_0[0][0]                   
                                                                   concatenate_1[0][0]              
                                                                   activation_85[0][0]              
____________________________________________________________________________________________________
conv2d_90 (Conv2D)               (None, None, None, 44 917504      mixed9[0][0]                     
____________________________________________________________________________________________________
batch_normalization_90 (BatchNor (None, None, None, 44 1344        conv2d_90[0][0]                  
____________________________________________________________________________________________________
activation_90 (Activation)       (None, None, None, 44 0           batch_normalization_90[0][0]     
____________________________________________________________________________________________________
conv2d_87 (Conv2D)               (None, None, None, 38 786432      mixed9[0][0]                     
____________________________________________________________________________________________________
conv2d_91 (Conv2D)               (None, None, None, 38 1548288     activation_90[0][0]              
____________________________________________________________________________________________________
batch_normalization_87 (BatchNor (None, None, None, 38 1152        conv2d_87[0][0]                  
____________________________________________________________________________________________________
batch_normalization_91 (BatchNor (None, None, None, 38 1152        conv2d_91[0][0]                  
____________________________________________________________________________________________________
activation_87 (Activation)       (None, None, None, 38 0           batch_normalization_87[0][0]     
____________________________________________________________________________________________________
activation_91 (Activation)       (None, None, None, 38 0           batch_normalization_91[0][0]     
____________________________________________________________________________________________________
conv2d_88 (Conv2D)               (None, None, None, 38 442368      activation_87[0][0]              
____________________________________________________________________________________________________
conv2d_89 (Conv2D)               (None, None, None, 38 442368      activation_87[0][0]              
____________________________________________________________________________________________________
conv2d_92 (Conv2D)               (None, None, None, 38 442368      activation_91[0][0]              
____________________________________________________________________________________________________
conv2d_93 (Conv2D)               (None, None, None, 38 442368      activation_91[0][0]              
____________________________________________________________________________________________________
average_pooling2d_9 (AveragePool (None, None, None, 20 0           mixed9[0][0]                     
____________________________________________________________________________________________________
conv2d_86 (Conv2D)               (None, None, None, 32 655360      mixed9[0][0]                     
____________________________________________________________________________________________________
batch_normalization_88 (BatchNor (None, None, None, 38 1152        conv2d_88[0][0]                  
____________________________________________________________________________________________________
batch_normalization_89 (BatchNor (None, None, None, 38 1152        conv2d_89[0][0]                  
____________________________________________________________________________________________________
batch_normalization_92 (BatchNor (None, None, None, 38 1152        conv2d_92[0][0]                  
____________________________________________________________________________________________________
batch_normalization_93 (BatchNor (None, None, None, 38 1152        conv2d_93[0][0]                  
____________________________________________________________________________________________________
conv2d_94 (Conv2D)               (None, None, None, 19 393216      average_pooling2d_9[0][0]        
____________________________________________________________________________________________________
batch_normalization_86 (BatchNor (None, None, None, 32 960         conv2d_86[0][0]                  
____________________________________________________________________________________________________
activation_88 (Activation)       (None, None, None, 38 0           batch_normalization_88[0][0]     
____________________________________________________________________________________________________
activation_89 (Activation)       (None, None, None, 38 0           batch_normalization_89[0][0]     
____________________________________________________________________________________________________
activation_92 (Activation)       (None, None, None, 38 0           batch_normalization_92[0][0]     
____________________________________________________________________________________________________
activation_93 (Activation)       (None, None, None, 38 0           batch_normalization_93[0][0]     
____________________________________________________________________________________________________
batch_normalization_94 (BatchNor (None, None, None, 19 576         conv2d_94[0][0]                  
____________________________________________________________________________________________________
activation_86 (Activation)       (None, None, None, 32 0           batch_normalization_86[0][0]     
____________________________________________________________________________________________________
mixed9_1 (Concatenate)           (None, None, None, 76 0           activation_88[0][0]              
                                                                   activation_89[0][0]              
____________________________________________________________________________________________________
concatenate_2 (Concatenate)      (None, None, None, 76 0           activation_92[0][0]              
                                                                   activation_93[0][0]              
____________________________________________________________________________________________________
activation_94 (Activation)       (None, None, None, 19 0           batch_normalization_94[0][0]     
____________________________________________________________________________________________________
mixed10 (Concatenate)            (None, None, None, 20 0           activation_86[0][0]              
                                                                   mixed9_1[0][0]                   
                                                                   concatenate_2[0][0]              
                                                                   activation_94[0][0]              
====================================================================================================
Total params: 21,802,784
Trainable params: 21,768,352
Non-trainable params: 34,432
____________________________________________________________________________________________________
In [13]:
# Freeze every layer of the pre-trained base model so that only the
# newly added classification head will be trained.
for layer in base_model.layers:
    layer.trainable = False
In [21]:
# Attach a new classification head on top of the frozen base model:
# global average pooling, three fully connected layers, and a 7-way
# softmax output (one class per person).
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1000, activation = 'relu')(x)
x = Dense(200, activation = 'relu')(x)
x = Dense(50, activation = 'relu')(x)
x = Dense(7, activation = 'softmax')(x)

my_model_1 = Model(inputs = base_model.input, outputs = x)
my_model_1.summary()
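As a sanity check on the summary that follows, the parameter count of the new head can be worked out by hand: a `Dense` layer with `n` inputs and `m` units contributes `n * m` weights plus `m` biases. A minimal sketch, assuming the pooled feature vector coming out of `GlobalAveragePooling2D` is 2048-dimensional (InceptionV3's final channel count):

```python
def dense_params(n_in, n_out):
    # weight matrix (n_in * n_out) plus one bias per output unit
    return n_in * n_out + n_out

# Head dimensions: 2048 -> 1000 -> 200 -> 50 -> 7
dims = [2048, 1000, 200, 50, 7]
head_params = sum(dense_params(a, b) for a, b in zip(dims, dims[1:]))
print(head_params)  # 2259607 extra (trainable) parameters from the head
```

These roughly 2.26 million head parameters are the only trainable ones after freezing, compared with the 21.8 million parameters of the base model.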
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, None, None, 3) 0                                            
____________________________________________________________________________________________________
conv2d_1 (Conv2D)                (None, None, None, 32 864         input_1[0][0]                    
____________________________________________________________________________________________________
batch_normalization_1 (BatchNorm (None, None, None, 32 96          conv2d_1[0][0]                   
____________________________________________________________________________________________________
activation_1 (Activation)        (None, None, None, 32 0           batch_normalization_1[0][0]      
____________________________________________________________________________________________________
conv2d_2 (Conv2D)                (None, None, None, 32 9216        activation_1[0][0]               
____________________________________________________________________________________________________
batch_normalization_2 (BatchNorm (None, None, None, 32 96          conv2d_2[0][0]                   
____________________________________________________________________________________________________
activation_2 (Activation)        (None, None, None, 32 0           batch_normalization_2[0][0]      
____________________________________________________________________________________________________
conv2d_3 (Conv2D)                (None, None, None, 64 18432       activation_2[0][0]               
____________________________________________________________________________________________________
batch_normalization_3 (BatchNorm (None, None, None, 64 192         conv2d_3[0][0]                   
____________________________________________________________________________________________________
activation_3 (Activation)        (None, None, None, 64 0           batch_normalization_3[0][0]      
____________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)   (None, None, None, 64 0           activation_3[0][0]               
____________________________________________________________________________________________________
conv2d_4 (Conv2D)                (None, None, None, 80 5120        max_pooling2d_1[0][0]            
____________________________________________________________________________________________________
batch_normalization_4 (BatchNorm (None, None, None, 80 240         conv2d_4[0][0]                   
____________________________________________________________________________________________________
activation_4 (Activation)        (None, None, None, 80 0           batch_normalization_4[0][0]      
____________________________________________________________________________________________________
conv2d_5 (Conv2D)                (None, None, None, 19 138240      activation_4[0][0]               
____________________________________________________________________________________________________
batch_normalization_5 (BatchNorm (None, None, None, 19 576         conv2d_5[0][0]                   
____________________________________________________________________________________________________
activation_5 (Activation)        (None, None, None, 19 0           batch_normalization_5[0][0]      
____________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)   (None, None, None, 19 0           activation_5[0][0]               
____________________________________________________________________________________________________
conv2d_9 (Conv2D)                (None, None, None, 64 12288       max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
batch_normalization_9 (BatchNorm (None, None, None, 64 192         conv2d_9[0][0]                   
____________________________________________________________________________________________________
activation_9 (Activation)        (None, None, None, 64 0           batch_normalization_9[0][0]      
____________________________________________________________________________________________________
conv2d_7 (Conv2D)                (None, None, None, 48 9216        max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_10 (Conv2D)               (None, None, None, 96 55296       activation_9[0][0]               
____________________________________________________________________________________________________
batch_normalization_7 (BatchNorm (None, None, None, 48 144         conv2d_7[0][0]                   
____________________________________________________________________________________________________
batch_normalization_10 (BatchNor (None, None, None, 96 288         conv2d_10[0][0]                  
____________________________________________________________________________________________________
activation_7 (Activation)        (None, None, None, 48 0           batch_normalization_7[0][0]      
____________________________________________________________________________________________________
activation_10 (Activation)       (None, None, None, 96 0           batch_normalization_10[0][0]     
____________________________________________________________________________________________________
average_pooling2d_1 (AveragePool (None, None, None, 19 0           max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_6 (Conv2D)                (None, None, None, 64 12288       max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_8 (Conv2D)                (None, None, None, 64 76800       activation_7[0][0]               
____________________________________________________________________________________________________
conv2d_11 (Conv2D)               (None, None, None, 96 82944       activation_10[0][0]              
____________________________________________________________________________________________________
conv2d_12 (Conv2D)               (None, None, None, 32 6144        average_pooling2d_1[0][0]        
____________________________________________________________________________________________________
batch_normalization_6 (BatchNorm (None, None, None, 64 192         conv2d_6[0][0]                   
____________________________________________________________________________________________________
batch_normalization_8 (BatchNorm (None, None, None, 64 192         conv2d_8[0][0]                   
____________________________________________________________________________________________________
batch_normalization_11 (BatchNor (None, None, None, 96 288         conv2d_11[0][0]                  
____________________________________________________________________________________________________
batch_normalization_12 (BatchNor (None, None, None, 32 96          conv2d_12[0][0]                  
____________________________________________________________________________________________________
activation_6 (Activation)        (None, None, None, 64 0           batch_normalization_6[0][0]      
____________________________________________________________________________________________________
activation_8 (Activation)        (None, None, None, 64 0           batch_normalization_8[0][0]      
____________________________________________________________________________________________________
activation_11 (Activation)       (None, None, None, 96 0           batch_normalization_11[0][0]     
____________________________________________________________________________________________________
activation_12 (Activation)       (None, None, None, 32 0           batch_normalization_12[0][0]     
____________________________________________________________________________________________________
mixed0 (Concatenate)             (None, None, None, 25 0           activation_6[0][0]               
                                                                   activation_8[0][0]               
                                                                   activation_11[0][0]              
                                                                   activation_12[0][0]              
____________________________________________________________________________________________________
conv2d_16 (Conv2D)               (None, None, None, 64 16384       mixed0[0][0]                     
____________________________________________________________________________________________________
batch_normalization_16 (BatchNor (None, None, None, 64 192         conv2d_16[0][0]                  
____________________________________________________________________________________________________
activation_16 (Activation)       (None, None, None, 64 0           batch_normalization_16[0][0]     
____________________________________________________________________________________________________
conv2d_14 (Conv2D)               (None, None, None, 48 12288       mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_17 (Conv2D)               (None, None, None, 96 55296       activation_16[0][0]              
____________________________________________________________________________________________________
batch_normalization_14 (BatchNor (None, None, None, 48 144         conv2d_14[0][0]                  
____________________________________________________________________________________________________
batch_normalization_17 (BatchNor (None, None, None, 96 288         conv2d_17[0][0]                  
____________________________________________________________________________________________________
activation_14 (Activation)       (None, None, None, 48 0           batch_normalization_14[0][0]     
____________________________________________________________________________________________________
activation_17 (Activation)       (None, None, None, 96 0           batch_normalization_17[0][0]     
____________________________________________________________________________________________________
average_pooling2d_2 (AveragePool (None, None, None, 25 0           mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_13 (Conv2D)               (None, None, None, 64 16384       mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_15 (Conv2D)               (None, None, None, 64 76800       activation_14[0][0]              
____________________________________________________________________________________________________
conv2d_18 (Conv2D)               (None, None, None, 96 82944       activation_17[0][0]              
____________________________________________________________________________________________________
conv2d_19 (Conv2D)               (None, None, None, 64 16384       average_pooling2d_2[0][0]        
____________________________________________________________________________________________________
batch_normalization_13 (BatchNor (None, None, None, 64 192         conv2d_13[0][0]                  
____________________________________________________________________________________________________
batch_normalization_15 (BatchNor (None, None, None, 64 192         conv2d_15[0][0]                  
____________________________________________________________________________________________________
batch_normalization_18 (BatchNor (None, None, None, 96 288         conv2d_18[0][0]                  
____________________________________________________________________________________________________
batch_normalization_19 (BatchNor (None, None, None, 64 192         conv2d_19[0][0]                  
____________________________________________________________________________________________________
activation_13 (Activation)       (None, None, None, 64 0           batch_normalization_13[0][0]     
____________________________________________________________________________________________________
activation_15 (Activation)       (None, None, None, 64 0           batch_normalization_15[0][0]     
____________________________________________________________________________________________________
activation_18 (Activation)       (None, None, None, 96 0           batch_normalization_18[0][0]     
____________________________________________________________________________________________________
activation_19 (Activation)       (None, None, None, 64 0           batch_normalization_19[0][0]     
____________________________________________________________________________________________________
mixed1 (Concatenate)             (None, None, None, 28 0           activation_13[0][0]              
                                                                   activation_15[0][0]              
                                                                   activation_18[0][0]              
                                                                   activation_19[0][0]              
____________________________________________________________________________________________________
conv2d_23 (Conv2D)               (None, None, None, 64 18432       mixed1[0][0]                     
____________________________________________________________________________________________________
batch_normalization_23 (BatchNor (None, None, None, 64 192         conv2d_23[0][0]                  
____________________________________________________________________________________________________
activation_23 (Activation)       (None, None, None, 64 0           batch_normalization_23[0][0]     
____________________________________________________________________________________________________
conv2d_21 (Conv2D)               (None, None, None, 48 13824       mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_24 (Conv2D)               (None, None, None, 96 55296       activation_23[0][0]              
____________________________________________________________________________________________________
batch_normalization_21 (BatchNor (None, None, None, 48 144         conv2d_21[0][0]                  
____________________________________________________________________________________________________
batch_normalization_24 (BatchNor (None, None, None, 96 288         conv2d_24[0][0]                  
____________________________________________________________________________________________________
activation_21 (Activation)       (None, None, None, 48 0           batch_normalization_21[0][0]     
____________________________________________________________________________________________________
activation_24 (Activation)       (None, None, None, 96 0           batch_normalization_24[0][0]     
____________________________________________________________________________________________________
average_pooling2d_3 (AveragePool (None, None, None, 28 0           mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_20 (Conv2D)               (None, None, None, 64 18432       mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_22 (Conv2D)               (None, None, None, 64 76800       activation_21[0][0]              
____________________________________________________________________________________________________
conv2d_25 (Conv2D)               (None, None, None, 96 82944       activation_24[0][0]              
____________________________________________________________________________________________________
conv2d_26 (Conv2D)               (None, None, None, 64 18432       average_pooling2d_3[0][0]        
____________________________________________________________________________________________________
batch_normalization_20 (BatchNor (None, None, None, 64 192         conv2d_20[0][0]                  
____________________________________________________________________________________________________
batch_normalization_22 (BatchNor (None, None, None, 64 192         conv2d_22[0][0]                  
____________________________________________________________________________________________________
batch_normalization_25 (BatchNor (None, None, None, 96 288         conv2d_25[0][0]                  
____________________________________________________________________________________________________
batch_normalization_26 (BatchNor (None, None, None, 64 192         conv2d_26[0][0]                  
____________________________________________________________________________________________________
activation_20 (Activation)       (None, None, None, 64 0           batch_normalization_20[0][0]     
____________________________________________________________________________________________________
activation_22 (Activation)       (None, None, None, 64 0           batch_normalization_22[0][0]     
____________________________________________________________________________________________________
activation_25 (Activation)       (None, None, None, 96 0           batch_normalization_25[0][0]     
____________________________________________________________________________________________________
activation_26 (Activation)       (None, None, None, 64 0           batch_normalization_26[0][0]     
____________________________________________________________________________________________________
mixed2 (Concatenate)             (None, None, None, 28 0           activation_20[0][0]              
                                                                   activation_22[0][0]              
                                                                   activation_25[0][0]              
                                                                   activation_26[0][0]              
____________________________________________________________________________________________________
conv2d_28 (Conv2D)               (None, None, None, 64 18432       mixed2[0][0]                     
____________________________________________________________________________________________________
batch_normalization_28 (BatchNor (None, None, None, 64 192         conv2d_28[0][0]                  
____________________________________________________________________________________________________
activation_28 (Activation)       (None, None, None, 64 0           batch_normalization_28[0][0]     
____________________________________________________________________________________________________
conv2d_29 (Conv2D)               (None, None, None, 96 55296       activation_28[0][0]              
____________________________________________________________________________________________________
batch_normalization_29 (BatchNor (None, None, None, 96 288         conv2d_29[0][0]                  
____________________________________________________________________________________________________
activation_29 (Activation)       (None, None, None, 96 0           batch_normalization_29[0][0]     
____________________________________________________________________________________________________
conv2d_27 (Conv2D)               (None, None, None, 38 995328      mixed2[0][0]                     
____________________________________________________________________________________________________
conv2d_30 (Conv2D)               (None, None, None, 96 82944       activation_29[0][0]              
____________________________________________________________________________________________________
batch_normalization_27 (BatchNor (None, None, None, 38 1152        conv2d_27[0][0]                  
____________________________________________________________________________________________________
batch_normalization_30 (BatchNor (None, None, None, 96 288         conv2d_30[0][0]                  
____________________________________________________________________________________________________
activation_27 (Activation)       (None, None, None, 38 0           batch_normalization_27[0][0]     
____________________________________________________________________________________________________
activation_30 (Activation)       (None, None, None, 96 0           batch_normalization_30[0][0]     
____________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D)   (None, None, None, 28 0           mixed2[0][0]                     
____________________________________________________________________________________________________
mixed3 (Concatenate)             (None, None, None, 76 0           activation_27[0][0]              
                                                                   activation_30[0][0]              
                                                                   max_pooling2d_3[0][0]            
____________________________________________________________________________________________________
conv2d_35 (Conv2D)               (None, None, None, 12 98304       mixed3[0][0]                     
____________________________________________________________________________________________________
batch_normalization_35 (BatchNor (None, None, None, 12 384         conv2d_35[0][0]                  
____________________________________________________________________________________________________
activation_35 (Activation)       (None, None, None, 12 0           batch_normalization_35[0][0]     
____________________________________________________________________________________________________
conv2d_36 (Conv2D)               (None, None, None, 12 114688      activation_35[0][0]              
____________________________________________________________________________________________________
batch_normalization_36 (BatchNor (None, None, None, 12 384         conv2d_36[0][0]                  
____________________________________________________________________________________________________
activation_36 (Activation)       (None, None, None, 12 0           batch_normalization_36[0][0]     
____________________________________________________________________________________________________
conv2d_32 (Conv2D)               (None, None, None, 12 98304       mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_37 (Conv2D)               (None, None, None, 12 114688      activation_36[0][0]              
____________________________________________________________________________________________________
batch_normalization_32 (BatchNor (None, None, None, 12 384         conv2d_32[0][0]                  
____________________________________________________________________________________________________
batch_normalization_37 (BatchNor (None, None, None, 12 384         conv2d_37[0][0]                  
____________________________________________________________________________________________________
activation_32 (Activation)       (None, None, None, 12 0           batch_normalization_32[0][0]     
____________________________________________________________________________________________________
activation_37 (Activation)       (None, None, None, 12 0           batch_normalization_37[0][0]     
____________________________________________________________________________________________________
conv2d_33 (Conv2D)               (None, None, None, 12 114688      activation_32[0][0]              
____________________________________________________________________________________________________
conv2d_38 (Conv2D)               (None, None, None, 12 114688      activation_37[0][0]              
____________________________________________________________________________________________________
batch_normalization_33 (BatchNor (None, None, None, 12 384         conv2d_33[0][0]                  
____________________________________________________________________________________________________
batch_normalization_38 (BatchNor (None, None, None, 12 384         conv2d_38[0][0]                  
____________________________________________________________________________________________________
activation_33 (Activation)       (None, None, None, 12 0           batch_normalization_33[0][0]     
____________________________________________________________________________________________________
activation_38 (Activation)       (None, None, None, 12 0           batch_normalization_38[0][0]     
____________________________________________________________________________________________________
average_pooling2d_4 (AveragePool (None, None, None, 76 0           mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_31 (Conv2D)               (None, None, None, 19 147456      mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_34 (Conv2D)               (None, None, None, 19 172032      activation_33[0][0]              
____________________________________________________________________________________________________
conv2d_39 (Conv2D)               (None, None, None, 19 172032      activation_38[0][0]              
____________________________________________________________________________________________________
conv2d_40 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_4[0][0]        
____________________________________________________________________________________________________
batch_normalization_31 (BatchNor (None, None, None, 19 576         conv2d_31[0][0]                  
____________________________________________________________________________________________________
batch_normalization_34 (BatchNor (None, None, None, 19 576         conv2d_34[0][0]                  
____________________________________________________________________________________________________
batch_normalization_39 (BatchNor (None, None, None, 19 576         conv2d_39[0][0]                  
____________________________________________________________________________________________________
batch_normalization_40 (BatchNor (None, None, None, 19 576         conv2d_40[0][0]                  
____________________________________________________________________________________________________
activation_31 (Activation)       (None, None, None, 19 0           batch_normalization_31[0][0]     
____________________________________________________________________________________________________
activation_34 (Activation)       (None, None, None, 19 0           batch_normalization_34[0][0]     
____________________________________________________________________________________________________
activation_39 (Activation)       (None, None, None, 19 0           batch_normalization_39[0][0]     
____________________________________________________________________________________________________
activation_40 (Activation)       (None, None, None, 19 0           batch_normalization_40[0][0]     
____________________________________________________________________________________________________
mixed4 (Concatenate)             (None, None, None, 76 0           activation_31[0][0]              
                                                                   activation_34[0][0]              
                                                                   activation_39[0][0]              
                                                                   activation_40[0][0]              
____________________________________________________________________________________________________
conv2d_45 (Conv2D)               (None, None, None, 16 122880      mixed4[0][0]                     
____________________________________________________________________________________________________
batch_normalization_45 (BatchNor (None, None, None, 16 480         conv2d_45[0][0]                  
____________________________________________________________________________________________________
activation_45 (Activation)       (None, None, None, 16 0           batch_normalization_45[0][0]     
____________________________________________________________________________________________________
conv2d_46 (Conv2D)               (None, None, None, 16 179200      activation_45[0][0]              
____________________________________________________________________________________________________
batch_normalization_46 (BatchNor (None, None, None, 16 480         conv2d_46[0][0]                  
____________________________________________________________________________________________________
activation_46 (Activation)       (None, None, None, 16 0           batch_normalization_46[0][0]     
____________________________________________________________________________________________________
conv2d_42 (Conv2D)               (None, None, None, 16 122880      mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_47 (Conv2D)               (None, None, None, 16 179200      activation_46[0][0]              
____________________________________________________________________________________________________
batch_normalization_42 (BatchNor (None, None, None, 16 480         conv2d_42[0][0]                  
____________________________________________________________________________________________________
batch_normalization_47 (BatchNor (None, None, None, 16 480         conv2d_47[0][0]                  
____________________________________________________________________________________________________
activation_42 (Activation)       (None, None, None, 16 0           batch_normalization_42[0][0]     
____________________________________________________________________________________________________
activation_47 (Activation)       (None, None, None, 16 0           batch_normalization_47[0][0]     
____________________________________________________________________________________________________
conv2d_43 (Conv2D)               (None, None, None, 16 179200      activation_42[0][0]              
____________________________________________________________________________________________________
conv2d_48 (Conv2D)               (None, None, None, 16 179200      activation_47[0][0]              
____________________________________________________________________________________________________
batch_normalization_43 (BatchNor (None, None, None, 16 480         conv2d_43[0][0]                  
____________________________________________________________________________________________________
batch_normalization_48 (BatchNor (None, None, None, 16 480         conv2d_48[0][0]                  
____________________________________________________________________________________________________
activation_43 (Activation)       (None, None, None, 16 0           batch_normalization_43[0][0]     
____________________________________________________________________________________________________
activation_48 (Activation)       (None, None, None, 16 0           batch_normalization_48[0][0]     
____________________________________________________________________________________________________
average_pooling2d_5 (AveragePool (None, None, None, 76 0           mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_41 (Conv2D)               (None, None, None, 19 147456      mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_44 (Conv2D)               (None, None, None, 19 215040      activation_43[0][0]              
____________________________________________________________________________________________________
conv2d_49 (Conv2D)               (None, None, None, 19 215040      activation_48[0][0]              
____________________________________________________________________________________________________
conv2d_50 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_5[0][0]        
____________________________________________________________________________________________________
batch_normalization_41 (BatchNor (None, None, None, 19 576         conv2d_41[0][0]                  
____________________________________________________________________________________________________
batch_normalization_44 (BatchNor (None, None, None, 19 576         conv2d_44[0][0]                  
____________________________________________________________________________________________________
batch_normalization_49 (BatchNor (None, None, None, 19 576         conv2d_49[0][0]                  
____________________________________________________________________________________________________
batch_normalization_50 (BatchNor (None, None, None, 19 576         conv2d_50[0][0]                  
____________________________________________________________________________________________________
activation_41 (Activation)       (None, None, None, 19 0           batch_normalization_41[0][0]     
____________________________________________________________________________________________________
activation_44 (Activation)       (None, None, None, 19 0           batch_normalization_44[0][0]     
____________________________________________________________________________________________________
activation_49 (Activation)       (None, None, None, 19 0           batch_normalization_49[0][0]     
____________________________________________________________________________________________________
activation_50 (Activation)       (None, None, None, 19 0           batch_normalization_50[0][0]     
____________________________________________________________________________________________________
mixed5 (Concatenate)             (None, None, None, 76 0           activation_41[0][0]              
                                                                   activation_44[0][0]              
                                                                   activation_49[0][0]              
                                                                   activation_50[0][0]              
____________________________________________________________________________________________________
conv2d_55 (Conv2D)               (None, None, None, 16 122880      mixed5[0][0]                     
____________________________________________________________________________________________________
batch_normalization_55 (BatchNor (None, None, None, 16 480         conv2d_55[0][0]                  
____________________________________________________________________________________________________
activation_55 (Activation)       (None, None, None, 16 0           batch_normalization_55[0][0]     
____________________________________________________________________________________________________
conv2d_56 (Conv2D)               (None, None, None, 16 179200      activation_55[0][0]              
____________________________________________________________________________________________________
batch_normalization_56 (BatchNor (None, None, None, 16 480         conv2d_56[0][0]                  
____________________________________________________________________________________________________
activation_56 (Activation)       (None, None, None, 16 0           batch_normalization_56[0][0]     
____________________________________________________________________________________________________
conv2d_52 (Conv2D)               (None, None, None, 16 122880      mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_57 (Conv2D)               (None, None, None, 16 179200      activation_56[0][0]              
____________________________________________________________________________________________________
batch_normalization_52 (BatchNor (None, None, None, 16 480         conv2d_52[0][0]                  
____________________________________________________________________________________________________
batch_normalization_57 (BatchNor (None, None, None, 16 480         conv2d_57[0][0]                  
____________________________________________________________________________________________________
activation_52 (Activation)       (None, None, None, 16 0           batch_normalization_52[0][0]     
____________________________________________________________________________________________________
activation_57 (Activation)       (None, None, None, 16 0           batch_normalization_57[0][0]     
____________________________________________________________________________________________________
conv2d_53 (Conv2D)               (None, None, None, 16 179200      activation_52[0][0]              
____________________________________________________________________________________________________
conv2d_58 (Conv2D)               (None, None, None, 16 179200      activation_57[0][0]              
____________________________________________________________________________________________________
batch_normalization_53 (BatchNor (None, None, None, 16 480         conv2d_53[0][0]                  
____________________________________________________________________________________________________
batch_normalization_58 (BatchNor (None, None, None, 16 480         conv2d_58[0][0]                  
____________________________________________________________________________________________________
activation_53 (Activation)       (None, None, None, 16 0           batch_normalization_53[0][0]     
____________________________________________________________________________________________________
activation_58 (Activation)       (None, None, None, 16 0           batch_normalization_58[0][0]     
____________________________________________________________________________________________________
average_pooling2d_6 (AveragePool (None, None, None, 76 0           mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_51 (Conv2D)               (None, None, None, 19 147456      mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_54 (Conv2D)               (None, None, None, 19 215040      activation_53[0][0]              
____________________________________________________________________________________________________
conv2d_59 (Conv2D)               (None, None, None, 19 215040      activation_58[0][0]              
____________________________________________________________________________________________________
conv2d_60 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_6[0][0]        
____________________________________________________________________________________________________
batch_normalization_51 (BatchNor (None, None, None, 19 576         conv2d_51[0][0]                  
____________________________________________________________________________________________________
batch_normalization_54 (BatchNor (None, None, None, 19 576         conv2d_54[0][0]                  
____________________________________________________________________________________________________
batch_normalization_59 (BatchNor (None, None, None, 19 576         conv2d_59[0][0]                  
____________________________________________________________________________________________________
batch_normalization_60 (BatchNor (None, None, None, 19 576         conv2d_60[0][0]                  
____________________________________________________________________________________________________
activation_51 (Activation)       (None, None, None, 19 0           batch_normalization_51[0][0]     
____________________________________________________________________________________________________
activation_54 (Activation)       (None, None, None, 19 0           batch_normalization_54[0][0]     
____________________________________________________________________________________________________
activation_59 (Activation)       (None, None, None, 19 0           batch_normalization_59[0][0]     
____________________________________________________________________________________________________
activation_60 (Activation)       (None, None, None, 19 0           batch_normalization_60[0][0]     
____________________________________________________________________________________________________
mixed6 (Concatenate)             (None, None, None, 76 0           activation_51[0][0]              
                                                                   activation_54[0][0]              
                                                                   activation_59[0][0]              
                                                                   activation_60[0][0]              
____________________________________________________________________________________________________
conv2d_65 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
batch_normalization_65 (BatchNor (None, None, None, 19 576         conv2d_65[0][0]                  
____________________________________________________________________________________________________
activation_65 (Activation)       (None, None, None, 19 0           batch_normalization_65[0][0]     
____________________________________________________________________________________________________
conv2d_66 (Conv2D)               (None, None, None, 19 258048      activation_65[0][0]              
____________________________________________________________________________________________________
batch_normalization_66 (BatchNor (None, None, None, 19 576         conv2d_66[0][0]                  
____________________________________________________________________________________________________
activation_66 (Activation)       (None, None, None, 19 0           batch_normalization_66[0][0]     
____________________________________________________________________________________________________
conv2d_62 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_67 (Conv2D)               (None, None, None, 19 258048      activation_66[0][0]              
____________________________________________________________________________________________________
batch_normalization_62 (BatchNor (None, None, None, 19 576         conv2d_62[0][0]                  
____________________________________________________________________________________________________
batch_normalization_67 (BatchNor (None, None, None, 19 576         conv2d_67[0][0]                  
____________________________________________________________________________________________________
activation_62 (Activation)       (None, None, None, 19 0           batch_normalization_62[0][0]     
____________________________________________________________________________________________________
activation_67 (Activation)       (None, None, None, 19 0           batch_normalization_67[0][0]     
____________________________________________________________________________________________________
conv2d_63 (Conv2D)               (None, None, None, 19 258048      activation_62[0][0]              
____________________________________________________________________________________________________
conv2d_68 (Conv2D)               (None, None, None, 19 258048      activation_67[0][0]              
____________________________________________________________________________________________________
batch_normalization_63 (BatchNor (None, None, None, 19 576         conv2d_63[0][0]                  
____________________________________________________________________________________________________
batch_normalization_68 (BatchNor (None, None, None, 19 576         conv2d_68[0][0]                  
____________________________________________________________________________________________________
activation_63 (Activation)       (None, None, None, 19 0           batch_normalization_63[0][0]     
____________________________________________________________________________________________________
activation_68 (Activation)       (None, None, None, 19 0           batch_normalization_68[0][0]     
____________________________________________________________________________________________________
average_pooling2d_7 (AveragePool (None, None, None, 76 0           mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_61 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_64 (Conv2D)               (None, None, None, 19 258048      activation_63[0][0]              
____________________________________________________________________________________________________
conv2d_69 (Conv2D)               (None, None, None, 19 258048      activation_68[0][0]              
____________________________________________________________________________________________________
conv2d_70 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_7[0][0]        
____________________________________________________________________________________________________
batch_normalization_61 (BatchNor (None, None, None, 19 576         conv2d_61[0][0]                  
____________________________________________________________________________________________________
batch_normalization_64 (BatchNor (None, None, None, 19 576         conv2d_64[0][0]                  
____________________________________________________________________________________________________
batch_normalization_69 (BatchNor (None, None, None, 19 576         conv2d_69[0][0]                  
____________________________________________________________________________________________________
batch_normalization_70 (BatchNor (None, None, None, 19 576         conv2d_70[0][0]                  
____________________________________________________________________________________________________
activation_61 (Activation)       (None, None, None, 19 0           batch_normalization_61[0][0]     
____________________________________________________________________________________________________
activation_64 (Activation)       (None, None, None, 19 0           batch_normalization_64[0][0]     
____________________________________________________________________________________________________
activation_69 (Activation)       (None, None, None, 19 0           batch_normalization_69[0][0]     
____________________________________________________________________________________________________
activation_70 (Activation)       (None, None, None, 19 0           batch_normalization_70[0][0]     
____________________________________________________________________________________________________
mixed7 (Concatenate)             (None, None, None, 76 0           activation_61[0][0]              
                                                                   activation_64[0][0]              
                                                                   activation_69[0][0]              
                                                                   activation_70[0][0]              
____________________________________________________________________________________________________
conv2d_73 (Conv2D)               (None, None, None, 19 147456      mixed7[0][0]                     
____________________________________________________________________________________________________
batch_normalization_73 (BatchNor (None, None, None, 19 576         conv2d_73[0][0]                  
____________________________________________________________________________________________________
activation_73 (Activation)       (None, None, None, 19 0           batch_normalization_73[0][0]     
____________________________________________________________________________________________________
conv2d_74 (Conv2D)               (None, None, None, 19 258048      activation_73[0][0]              
____________________________________________________________________________________________________
batch_normalization_74 (BatchNor (None, None, None, 19 576         conv2d_74[0][0]                  
____________________________________________________________________________________________________
activation_74 (Activation)       (None, None, None, 19 0           batch_normalization_74[0][0]     
____________________________________________________________________________________________________
conv2d_71 (Conv2D)               (None, None, None, 19 147456      mixed7[0][0]                     
____________________________________________________________________________________________________
conv2d_75 (Conv2D)               (None, None, None, 19 258048      activation_74[0][0]              
____________________________________________________________________________________________________
batch_normalization_71 (BatchNor (None, None, None, 19 576         conv2d_71[0][0]                  
____________________________________________________________________________________________________
batch_normalization_75 (BatchNor (None, None, None, 19 576         conv2d_75[0][0]                  
____________________________________________________________________________________________________
activation_71 (Activation)       (None, None, None, 19 0           batch_normalization_71[0][0]     
____________________________________________________________________________________________________
activation_75 (Activation)       (None, None, None, 19 0           batch_normalization_75[0][0]     
____________________________________________________________________________________________________
conv2d_72 (Conv2D)               (None, None, None, 32 552960      activation_71[0][0]              
____________________________________________________________________________________________________
conv2d_76 (Conv2D)               (None, None, None, 19 331776      activation_75[0][0]              
____________________________________________________________________________________________________
batch_normalization_72 (BatchNor (None, None, None, 32 960         conv2d_72[0][0]                  
____________________________________________________________________________________________________
batch_normalization_76 (BatchNor (None, None, None, 19 576         conv2d_76[0][0]                  
____________________________________________________________________________________________________
activation_72 (Activation)       (None, None, None, 32 0           batch_normalization_72[0][0]     
____________________________________________________________________________________________________
activation_76 (Activation)       (None, None, None, 19 0           batch_normalization_76[0][0]     
____________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D)   (None, None, None, 76 0           mixed7[0][0]                     
____________________________________________________________________________________________________
mixed8 (Concatenate)             (None, None, None, 12 0           activation_72[0][0]              
                                                                   activation_76[0][0]              
                                                                   max_pooling2d_4[0][0]            
____________________________________________________________________________________________________
conv2d_81 (Conv2D)               (None, None, None, 44 573440      mixed8[0][0]                     
____________________________________________________________________________________________________
batch_normalization_81 (BatchNor (None, None, None, 44 1344        conv2d_81[0][0]                  
____________________________________________________________________________________________________
activation_81 (Activation)       (None, None, None, 44 0           batch_normalization_81[0][0]     
____________________________________________________________________________________________________
conv2d_78 (Conv2D)               (None, None, None, 38 491520      mixed8[0][0]                     
____________________________________________________________________________________________________
conv2d_82 (Conv2D)               (None, None, None, 38 1548288     activation_81[0][0]              
____________________________________________________________________________________________________
batch_normalization_78 (BatchNor (None, None, None, 38 1152        conv2d_78[0][0]                  
____________________________________________________________________________________________________
batch_normalization_82 (BatchNor (None, None, None, 38 1152        conv2d_82[0][0]                  
____________________________________________________________________________________________________
activation_78 (Activation)       (None, None, None, 38 0           batch_normalization_78[0][0]     
____________________________________________________________________________________________________
activation_82 (Activation)       (None, None, None, 38 0           batch_normalization_82[0][0]     
____________________________________________________________________________________________________
conv2d_79 (Conv2D)               (None, None, None, 38 442368      activation_78[0][0]              
____________________________________________________________________________________________________
conv2d_80 (Conv2D)               (None, None, None, 38 442368      activation_78[0][0]              
____________________________________________________________________________________________________
conv2d_83 (Conv2D)               (None, None, None, 38 442368      activation_82[0][0]              
____________________________________________________________________________________________________
conv2d_84 (Conv2D)               (None, None, None, 38 442368      activation_82[0][0]              
____________________________________________________________________________________________________
average_pooling2d_8 (AveragePool (None, None, None, 12 0           mixed8[0][0]                     
____________________________________________________________________________________________________
conv2d_77 (Conv2D)               (None, None, None, 32 409600      mixed8[0][0]                     
____________________________________________________________________________________________________
batch_normalization_79 (BatchNor (None, None, None, 38 1152        conv2d_79[0][0]                  
____________________________________________________________________________________________________
batch_normalization_80 (BatchNor (None, None, None, 38 1152        conv2d_80[0][0]                  
____________________________________________________________________________________________________
batch_normalization_83 (BatchNor (None, None, None, 38 1152        conv2d_83[0][0]                  
____________________________________________________________________________________________________
batch_normalization_84 (BatchNor (None, None, None, 38 1152        conv2d_84[0][0]                  
____________________________________________________________________________________________________
conv2d_85 (Conv2D)               (None, None, None, 19 245760      average_pooling2d_8[0][0]        
____________________________________________________________________________________________________
batch_normalization_77 (BatchNor (None, None, None, 32 960         conv2d_77[0][0]                  
____________________________________________________________________________________________________
activation_79 (Activation)       (None, None, None, 38 0           batch_normalization_79[0][0]     
____________________________________________________________________________________________________
activation_80 (Activation)       (None, None, None, 38 0           batch_normalization_80[0][0]     
____________________________________________________________________________________________________
activation_83 (Activation)       (None, None, None, 38 0           batch_normalization_83[0][0]     
____________________________________________________________________________________________________
activation_84 (Activation)       (None, None, None, 38 0           batch_normalization_84[0][0]     
____________________________________________________________________________________________________
batch_normalization_85 (BatchNor (None, None, None, 19 576         conv2d_85[0][0]                  
____________________________________________________________________________________________________
activation_77 (Activation)       (None, None, None, 32 0           batch_normalization_77[0][0]     
____________________________________________________________________________________________________
mixed9_0 (Concatenate)           (None, None, None, 76 0           activation_79[0][0]              
                                                                   activation_80[0][0]              
____________________________________________________________________________________________________
concatenate_1 (Concatenate)      (None, None, None, 76 0           activation_83[0][0]              
                                                                   activation_84[0][0]              
____________________________________________________________________________________________________
activation_85 (Activation)       (None, None, None, 19 0           batch_normalization_85[0][0]     
____________________________________________________________________________________________________
mixed9 (Concatenate)             (None, None, None, 20 0           activation_77[0][0]              
                                                                   mixed9_0[0][0]                   
                                                                   concatenate_1[0][0]              
                                                                   activation_85[0][0]              
____________________________________________________________________________________________________
conv2d_90 (Conv2D)               (None, None, None, 44 917504      mixed9[0][0]                     
____________________________________________________________________________________________________
batch_normalization_90 (BatchNor (None, None, None, 44 1344        conv2d_90[0][0]                  
____________________________________________________________________________________________________
activation_90 (Activation)       (None, None, None, 44 0           batch_normalization_90[0][0]     
____________________________________________________________________________________________________
conv2d_87 (Conv2D)               (None, None, None, 38 786432      mixed9[0][0]                     
____________________________________________________________________________________________________
conv2d_91 (Conv2D)               (None, None, None, 38 1548288     activation_90[0][0]              
____________________________________________________________________________________________________
batch_normalization_87 (BatchNor (None, None, None, 38 1152        conv2d_87[0][0]                  
____________________________________________________________________________________________________
batch_normalization_91 (BatchNor (None, None, None, 38 1152        conv2d_91[0][0]                  
____________________________________________________________________________________________________
activation_87 (Activation)       (None, None, None, 38 0           batch_normalization_87[0][0]     
____________________________________________________________________________________________________
activation_91 (Activation)       (None, None, None, 38 0           batch_normalization_91[0][0]     
____________________________________________________________________________________________________
conv2d_88 (Conv2D)               (None, None, None, 38 442368      activation_87[0][0]              
____________________________________________________________________________________________________
conv2d_89 (Conv2D)               (None, None, None, 38 442368      activation_87[0][0]              
____________________________________________________________________________________________________
conv2d_92 (Conv2D)               (None, None, None, 38 442368      activation_91[0][0]              
____________________________________________________________________________________________________
conv2d_93 (Conv2D)               (None, None, None, 38 442368      activation_91[0][0]              
____________________________________________________________________________________________________
average_pooling2d_9 (AveragePool (None, None, None, 20 0           mixed9[0][0]                     
____________________________________________________________________________________________________
conv2d_86 (Conv2D)               (None, None, None, 32 655360      mixed9[0][0]                     
____________________________________________________________________________________________________
batch_normalization_88 (BatchNor (None, None, None, 38 1152        conv2d_88[0][0]                  
____________________________________________________________________________________________________
batch_normalization_89 (BatchNor (None, None, None, 38 1152        conv2d_89[0][0]                  
____________________________________________________________________________________________________
batch_normalization_92 (BatchNor (None, None, None, 38 1152        conv2d_92[0][0]                  
____________________________________________________________________________________________________
batch_normalization_93 (BatchNor (None, None, None, 38 1152        conv2d_93[0][0]                  
____________________________________________________________________________________________________
conv2d_94 (Conv2D)               (None, None, None, 19 393216      average_pooling2d_9[0][0]        
____________________________________________________________________________________________________
batch_normalization_86 (BatchNor (None, None, None, 32 960         conv2d_86[0][0]                  
____________________________________________________________________________________________________
activation_88 (Activation)       (None, None, None, 38 0           batch_normalization_88[0][0]     
____________________________________________________________________________________________________
activation_89 (Activation)       (None, None, None, 38 0           batch_normalization_89[0][0]     
____________________________________________________________________________________________________
activation_92 (Activation)       (None, None, None, 38 0           batch_normalization_92[0][0]     
____________________________________________________________________________________________________
activation_93 (Activation)       (None, None, None, 38 0           batch_normalization_93[0][0]     
____________________________________________________________________________________________________
batch_normalization_94 (BatchNor (None, None, None, 19 576         conv2d_94[0][0]                  
____________________________________________________________________________________________________
activation_86 (Activation)       (None, None, None, 32 0           batch_normalization_86[0][0]     
____________________________________________________________________________________________________
mixed9_1 (Concatenate)           (None, None, None, 76 0           activation_88[0][0]              
                                                                   activation_89[0][0]              
____________________________________________________________________________________________________
concatenate_2 (Concatenate)      (None, None, None, 76 0           activation_92[0][0]              
                                                                   activation_93[0][0]              
____________________________________________________________________________________________________
activation_94 (Activation)       (None, None, None, 19 0           batch_normalization_94[0][0]     
____________________________________________________________________________________________________
mixed10 (Concatenate)            (None, None, None, 20 0           activation_86[0][0]              
                                                                   mixed9_1[0][0]                   
                                                                   concatenate_2[0][0]              
                                                                   activation_94[0][0]              
____________________________________________________________________________________________________
global_average_pooling2d_2 (Glob (None, 2048)          0           mixed10[0][0]                    
____________________________________________________________________________________________________
dense_5 (Dense)                  (None, 1000)          2049000     global_average_pooling2d_2[0][0] 
____________________________________________________________________________________________________
dense_6 (Dense)                  (None, 200)           200200      dense_5[0][0]                    
____________________________________________________________________________________________________
dense_7 (Dense)                  (None, 50)            10050       dense_6[0][0]                    
____________________________________________________________________________________________________
dense_8 (Dense)                  (None, 7)             357         dense_7[0][0]                    
====================================================================================================
Total params: 24,062,391
Trainable params: 2,259,607
Non-trainable params: 21,802,784
____________________________________________________________________________________________________
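As a quick sanity check (my own arithmetic sketch, not a notebook cell), the "Trainable params" figure above can be reproduced from the four Dense layers alone, since the InceptionV3 base is frozen: a Dense layer holds `n_in * n_out` weights plus `n_out` biases.

```python
# Reproduce the trainable-parameter count of the Dense head from the summary:
# 2048 -> 1000 -> 200 -> 50 -> 7, as shown in dense_5 .. dense_8 above.
def dense_params(n_in, n_out):
    # weight matrix (n_in * n_out) plus one bias per output unit
    return n_in * n_out + n_out

head = [(2048, 1000), (1000, 200), (200, 50), (50, 7)]
counts = [dense_params(i, o) for i, o in head]
print(counts)       # [2049000, 200200, 10050, 357]
print(sum(counts))  # 2259607 -- matches "Trainable params: 2,259,607"
```

The per-layer counts match the `Param #` column of `dense_5` through `dense_8` exactly, confirming that only the classification head is being trained.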
In [39]:
# Load the list of images and categories
train_faces, train_targets = load_dataset('./images-fewer/Train')
test_faces, test_targets = load_dataset('./images-fewer/Test')
validate_faces, validate_targets = load_dataset('./images-fewer/Validate')

# NOTE: item[15:-1] strips the './images-fewer/' prefix and trailing slash, but on
# Windows it leaves the 'Train\' segment in place, as the printed list below shows.
face_names = [item[15:-1] for item in glob('./images-fewer/Train/*/')]

print('There are %d face categories.' % len(face_names))
print(face_names)
print('There are %d total faces.' % len(np.hstack([train_faces, test_faces, validate_faces])))
print('There are %d training faces.' % len(train_faces))
print('There are %d test faces.' % len(test_faces))
print('There are %d validate faces.' % len(validate_faces))
There are 7 face categories.
['Train\\Brother', 'Train\\Dad', 'Train\\Daughter', 'Train\\Me', 'Train\\Mum', 'Train\\Son', 'Train\\Wife']
There are 263 total faces.
There are 148 training faces.
There are 58 test faces.
There are 57 validate faces.
In [40]:
from keras.preprocessing import image

def path_to_tensor(img_path):
    # Load one image, resize it to InceptionV3's 299x299 input size,
    # and add a leading batch axis -> shape (1, 299, 299, 3)
    img = image.load_img(img_path, target_size=(299,299))
    x = image.img_to_array(img)
    return np.expand_dims(x, axis=0)

def paths_to_tensor(img_paths):
    # Stack the per-image (1, 299, 299, 3) tensors into one (N, 299, 299, 3) batch
    list_of_tensors = [path_to_tensor(img_path) for img_path in img_paths]
    return np.vstack(list_of_tensors)
In [41]:
# Read the images as numpy arrays
train_tensors = paths_to_tensor(train_faces).astype('float32')/255
test_tensors = paths_to_tensor(test_faces).astype('float32')/255
validate_tensors = paths_to_tensor(validate_faces).astype('float32')/255
print("Train tensor shape.", train_tensors.shape)
print('Test tensor shape.', test_tensors.shape)
print('Validate tensor shape.', validate_tensors.shape)
Train tensor shape. (148, 299, 299, 3)
Test tensor shape. (58, 299, 299, 3)
Validate tensor shape. (57, 299, 299, 3)
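The stacking and scaling above can be mimicked with dummy arrays (a standalone sketch; the real cells load actual image files): each image becomes a `(1, 299, 299, 3)` tensor, `np.vstack` merges them along the batch axis, and dividing by 255 rescales the uint8 pixel range to [0, 1].

```python
import numpy as np

# Mimic path_to_tensor / paths_to_tensor with random pixel data instead of files.
def fake_image_tensor():
    img = np.random.randint(0, 256, size=(299, 299, 3)).astype('float32')
    return np.expand_dims(img, axis=0)  # shape (1, 299, 299, 3)

batch = np.vstack([fake_image_tensor() for _ in range(5)]) / 255
print(batch.shape)                             # (5, 299, 299, 3)
print(batch.min() >= 0.0, batch.max() <= 1.0)  # True True -- pixels scaled to [0, 1]
```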
In [26]:
from keras.callbacks import ModelCheckpoint, EarlyStopping, ReduceLROnPlateau, LambdaCallback
from keras import backend as K

my_model_1.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
checkpointer = ModelCheckpoint(filepath='my_model_1.h5',
                               verbose=1, save_best_only=True)
early_stopping = EarlyStopping(monitor='val_loss', min_delta=0, patience=30, verbose=1, mode='auto')
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.9, patience=10, cooldown=0, min_lr=0.00001)
lr_print = LambdaCallback(on_epoch_begin=lambda epoch, logs: print('lr:', K.eval(my_model_1.optimizer.lr)))
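With `factor=0.9` and `min_lr=0.00001`, each detected plateau multiplies the learning rate by 0.9 until it reaches the floor. A quick simulation of that decay schedule (my own sketch, independent of Keras):

```python
# Simulate ReduceLROnPlateau's decay: lr *= factor per plateau, clipped at min_lr.
def lr_after_plateaus(lr0=0.001, factor=0.9, min_lr=0.00001, plateaus=0):
    lr = lr0
    for _ in range(plateaus):
        lr = max(lr * factor, min_lr)
    return lr

print(round(lr_after_plateaus(plateaus=1), 6))  # 0.0009
print(round(lr_after_plateaus(plateaus=5), 6))  # 0.00059
```

So even after many plateaus the rate decays gently (0.9 per step) rather than collapsing, which suits the small 148-sample training set here.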
In [27]:
hist_1 = my_model_1.fit(train_tensors, train_targets,
            validation_data=(validate_tensors, validate_targets),
            epochs=200, verbose=1, batch_size=20,
            callbacks=[checkpointer, early_stopping, reduce_lr, lr_print])
Train on 148 samples, validate on 57 samples
lr: 0.001
Epoch 1/200
140/148 [===========================>..] - ETA: 0s - loss: 4.5652 - acc: 0.1571Epoch 00000: val_loss improved from inf to 2.13415, saving model to my_model_1.h5
148/148 [==============================] - 7s - loss: 4.4206 - acc: 0.1554 - val_loss: 2.1342 - val_acc: 0.2105
lr: 0.001
Epoch 2/200
140/148 [===========================>..] - ETA: 0s - loss: 1.7422 - acc: 0.2643Epoch 00001: val_loss improved from 2.13415 to 1.66309, saving model to my_model_1.h5
148/148 [==============================] - 3s - loss: 1.7408 - acc: 0.2568 - val_loss: 1.6631 - val_acc: 0.3684
lr: 0.001
Epoch 3/200
140/148 [===========================>..] - ETA: 0s - loss: 1.6977 - acc: 0.3071Epoch 00002: val_loss improved from 1.66309 to 1.61623, saving model to my_model_1.h5
148/148 [==============================] - 3s - loss: 1.6821 - acc: 0.2973 - val_loss: 1.6162 - val_acc: 0.4211
lr: 0.001
Epoch 4/200
140/148 [===========================>..] - ETA: 0s - loss: 1.5571 - acc: 0.4071Epoch 00003: val_loss improved from 1.61623 to 1.55405, saving model to my_model_1.h5
148/148 [==============================] - 3s - loss: 1.5611 - acc: 0.4054 - val_loss: 1.5541 - val_acc: 0.3684
lr: 0.001
Epoch 5/200
140/148 [===========================>..] - ETA: 0s - loss: 1.6977 - acc: 0.3071Epoch 00004: val_loss improved from 1.55405 to 1.39730, saving model to my_model_1.h5
148/148 [==============================] - 3s - loss: 1.6886 - acc: 0.3041 - val_loss: 1.3973 - val_acc: 0.3333
lr: 0.001
Epoch 6/200
140/148 [===========================>..] - ETA: 0s - loss: 1.2421 - acc: 0.5071Epoch 00005: val_loss did not improve
148/148 [==============================] - 1s - loss: 1.2546 - acc: 0.4865 - val_loss: 2.9572 - val_acc: 0.1754
lr: 0.001
Epoch 7/200
140/148 [===========================>..] - ETA: 0s - loss: 1.3229 - acc: 0.4500Epoch 00006: val_loss did not improve
148/148 [==============================] - 1s - loss: 1.3060 - acc: 0.4595 - val_loss: 1.8925 - val_acc: 0.3158
lr: 0.001
Epoch 8/200
140/148 [===========================>..] - ETA: 0s - loss: 1.3546 - acc: 0.4857Epoch 00007: val_loss did not improve
148/148 [==============================] - 1s - loss: 1.3587 - acc: 0.4730 - val_loss: 1.4916 - val_acc: 0.4211
lr: 0.001
Epoch 9/200
140/148 [===========================>..] - ETA: 0s - loss: 1.3877 - acc: 0.4000Epoch 00008: val_loss did not improve
148/148 [==============================] - 1s - loss: 1.3478 - acc: 0.4189 - val_loss: 1.8474 - val_acc: 0.3684
lr: 0.001
Epoch 10/200
140/148 [===========================>..] - ETA: 0s - loss: 0.8581 - acc: 0.6357Epoch 00009: val_loss did not improve
148/148 [==============================] - 1s - loss: 1.0222 - acc: 0.6014 - val_loss: 1.9560 - val_acc: 0.3158
lr: 0.001
Epoch 11/200
140/148 [===========================>..] - ETA: 0s - loss: 0.9653 - acc: 0.6571Epoch 00010: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.9761 - acc: 0.6419 - val_loss: 1.8341 - val_acc: 0.3158
lr: 0.001
Epoch 12/200
140/148 [===========================>..] - ETA: 0s - loss: 0.7858 - acc: 0.6714Epoch 00011: val_loss improved from 1.39730 to 1.38855, saving model to my_model_1.h5
148/148 [==============================] - 3s - loss: 0.8674 - acc: 0.6486 - val_loss: 1.3885 - val_acc: 0.3860
lr: 0.001
Epoch 13/200
140/148 [===========================>..] - ETA: 0s - loss: 0.8630 - acc: 0.6857Epoch 00012: val_loss improved from 1.38855 to 1.12812, saving model to my_model_1.h5
148/148 [==============================] - 2s - loss: 0.8406 - acc: 0.7027 - val_loss: 1.1281 - val_acc: 0.5263
lr: 0.001
Epoch 14/200
140/148 [===========================>..] - ETA: 0s - loss: 1.0578 - acc: 0.6500Epoch 00013: val_loss did not improve
148/148 [==============================] - 1s - loss: 1.0664 - acc: 0.6419 - val_loss: 1.4853 - val_acc: 0.3333
lr: 0.001
Epoch 15/200
140/148 [===========================>..] - ETA: 0s - loss: 0.7282 - acc: 0.6714Epoch 00014: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.8352 - acc: 0.6622 - val_loss: 1.2791 - val_acc: 0.5088
lr: 0.001
Epoch 16/200
140/148 [===========================>..] - ETA: 0s - loss: 0.4892 - acc: 0.8357Epoch 00015: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.4900 - acc: 0.8311 - val_loss: 1.3549 - val_acc: 0.5088
lr: 0.001
Epoch 17/200
140/148 [===========================>..] - ETA: 0s - loss: 0.5659 - acc: 0.7357Epoch 00016: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.6674 - acc: 0.7162 - val_loss: 1.6742 - val_acc: 0.3684
lr: 0.001
Epoch 18/200
140/148 [===========================>..] - ETA: 0s - loss: 0.4760 - acc: 0.8571Epoch 00017: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.4665 - acc: 0.8649 - val_loss: 1.1719 - val_acc: 0.5789
lr: 0.001
Epoch 19/200
140/148 [===========================>..] - ETA: 0s - loss: 0.4896 - acc: 0.8071Epoch 00018: val_loss improved from 1.12812 to 1.03066, saving model to my_model_1.h5
148/148 [==============================] - 2s - loss: 0.4699 - acc: 0.8176 - val_loss: 1.0307 - val_acc: 0.6140
lr: 0.001
Epoch 20/200
140/148 [===========================>..] - ETA: 0s - loss: 0.4586 - acc: 0.7857Epoch 00019: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.4524 - acc: 0.7838 - val_loss: 1.1686 - val_acc: 0.5614
lr: 0.001
Epoch 21/200
140/148 [===========================>..] - ETA: 0s - loss: 0.9808 - acc: 0.6857Epoch 00020: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.9415 - acc: 0.6959 - val_loss: 1.0490 - val_acc: 0.6491
lr: 0.001
Epoch 22/200
140/148 [===========================>..] - ETA: 0s - loss: 0.2343 - acc: 0.9143Epoch 00021: val_loss improved from 1.03066 to 0.95162, saving model to my_model_1.h5
148/148 [==============================] - 2s - loss: 0.2362 - acc: 0.9122 - val_loss: 0.9516 - val_acc: 0.6316
lr: 0.001
Epoch 23/200
140/148 [===========================>..] - ETA: 0s - loss: 0.5188 - acc: 0.7929Epoch 00022: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.5239 - acc: 0.7905 - val_loss: 1.1117 - val_acc: 0.5439
lr: 0.001
Epoch 24/200
140/148 [===========================>..] - ETA: 0s - loss: 0.3696 - acc: 0.9000Epoch 00023: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.3626 - acc: 0.8986 - val_loss: 1.6812 - val_acc: 0.5965
lr: 0.001
Epoch 25/200
140/148 [===========================>..] - ETA: 0s - loss: 0.3574 - acc: 0.8571Epoch 00024: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.3628 - acc: 0.8581 - val_loss: 1.4499 - val_acc: 0.5614
lr: 0.001
Epoch 26/200
140/148 [===========================>..] - ETA: 0s - loss: 0.4923 - acc: 0.8214Epoch 00025: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.4856 - acc: 0.8243 - val_loss: 1.4156 - val_acc: 0.6491
lr: 0.001
Epoch 27/200
140/148 [===========================>..] - ETA: 0s - loss: 0.3919 - acc: 0.8643Epoch 00026: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.4277 - acc: 0.8446 - val_loss: 1.6682 - val_acc: 0.6667
lr: 0.001
Epoch 28/200
140/148 [===========================>..] - ETA: 0s - loss: 0.3867 - acc: 0.9286Epoch 00027: val_loss did not improve
148/148 [==============================] - 1s - loss: 0.3794 - acc: 0.9257 - val_loss: 1.3040 - val_acc: 0.6842
lr: 0.001
Epoch 29/200
140/148 [===========================>..] - ETA: 0s - loss: 0.2188 - acc: 0.8929Epoch 00028: val_loss improved from 0.95162 to 0.76887, saving model to my_model_1.h5
148/148 [==============================] - 2s - loss: 0.2145 - acc: 0.8986 - val_loss: 0.7689 - val_acc: 0.7193
lr: 0.001
Epoch 30/200
[Epochs 30–35: val_loss did not improve; lr: 0.001]
Epoch 36/200
148/148 [==============================] - 2s - loss: 0.0861 - acc: 0.9662 - val_loss: 0.7587 - val_acc: 0.7544
Epoch 00035: val_loss improved from 0.76887 to 0.75875, saving model to my_model_1.h5
[Epochs 37–47: val_loss did not improve; lr decayed to 0.0009]
Epoch 48/200
148/148 [==============================] - 3s - loss: 0.0906 - acc: 0.9730 - val_loss: 0.7421 - val_acc: 0.7895
Epoch 00047: val_loss improved from 0.75875 to 0.74206, saving model to my_model_1.h5
[Epochs 49–54: val_loss did not improve]
Epoch 55/200
148/148 [==============================] - 2s - loss: 0.0165 - acc: 0.9932 - val_loss: 0.6461 - val_acc: 0.8070
Epoch 00054: val_loss improved from 0.74206 to 0.64612, saving model to my_model_1.h5
[Epochs 56–85: val_loss did not improve; lr decayed to 0.00081, then 0.000729]
Epoch 00085: early stopping
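
The lr values printed in the log fall from 0.001 to 0.0009, 0.00081, and 0.000729, i.e. a multiplicative decay by 0.9 each time val_loss plateaus (a ReduceLROnPlateau-style schedule). The arithmetic behind those numbers:

```python
# The learning rate starts at 1e-3 and is multiplied by 0.9 on each
# plateau, matching the lr values printed in the training log above.
initial_lr, factor = 1e-3, 0.9

def lr_after(n_reductions):
    return initial_lr * factor ** n_reductions

print([round(lr_after(n), 6) for n in range(4)])
# → [0.001, 0.0009, 0.00081, 0.000729]
```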
In [28]:
# Visualize the training and validation accuracy and loss of the network
import matplotlib.pyplot as plt
def plt_hist(hist):
    print(hist.history.keys())
    # summarize history for accuracy
    plt.plot(hist.history['acc'])
    plt.plot(hist.history['val_acc'])
    plt.title('model accuracy')
    plt.ylabel('accuracy')
    plt.xlabel('epoch')
    plt.legend(['train', 'validation'], loc='upper left')
    plt.show()
    # summarize history for loss
    plt.plot(hist.history['loss'])
    plt.plot(hist.history['val_loss'])
    plt.title('model loss')
    plt.ylabel('loss')
    plt.xlabel('epoch')
    plt.legend(['train', 'validation'], loc='upper left')
    plt.show()
In [29]:
plt_hist(hist_1)
dict_keys(['loss', 'val_loss', 'val_acc', 'acc', 'lr'])
In [42]:
my_model_1 = load_model('./my_model_1.h5')
score = my_model_1.evaluate(test_tensors, test_targets, verbose=1)
print('Test loss:', score[0])
print('Test accuracy:', score[1])
58/58 [==============================] - 3s     
Test loss: 0.36949106126
Test accuracy: 0.844827617037

The test accuracy is 84.5%, lower than I expected given that there are only 7 classes. I tried tuning the number of dense layers and the neurons in them, but it didn't improve much. The limited number of images is the likely culprit.
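
With 84.5% accuracy over 7 classes, a confusion matrix would show which family members get mixed up with each other. A minimal NumPy sketch, using small synthetic label arrays in place of `np.argmax(test_targets, axis=1)` and `np.argmax(my_model_1.predict(test_tensors), axis=1)`:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """cm[i, j] counts faces of true class i predicted as class j."""
    cm = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1
    return cm

# Synthetic stand-ins for the real test labels and predictions.
true_labels = np.array([0, 0, 1, 1, 2, 2])
pred_labels = np.array([0, 1, 1, 1, 2, 0])

cm = confusion_matrix(true_labels, pred_labels, n_classes=3)
print(cm)                                    # off-diagonal entries are the mix-ups
print('accuracy:', np.trace(cm) / cm.sum())  # diagonal over total
```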

Part 3 - Improve accuracy with more images

More images were added to the existing folders, so I just need to refresh the tensors and retrain the model.

In [79]:
# Load the list of images and categories
train_faces, train_targets = load_dataset('./images/Train')
test_faces, test_targets = load_dataset('./images/Test')
validate_faces, validate_targets = load_dataset('./images/Validate')

face_names = [item[15:-1] for item in glob('./images/Train/*/')]

print(face_names)
print('There are %d face categories.' % len(face_names))
print('There are %d total faces.' % len(np.hstack([train_faces, test_faces, validate_faces])))
print('There are %d training faces.' % len(train_faces))
print('There are %d test faces.' % len(test_faces))
print('There are %d validate faces.' % len(validate_faces))
['Brother', 'Dad', 'Daughter', 'Me', 'Mum', 'Son', 'Wife']
There are 7 face categories.
There are 428 total faces.
There are 257 training faces.
There are 98 test faces.
There are 73 validate faces.
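
The 428 faces end up split roughly 60/23/17 between Train, Test, and Validate. A sketch of how such a split can be produced (`split_paths` is a hypothetical helper for illustration, not the notebook's `load_dataset`):

```python
import random

def split_paths(paths, train=0.6, test=0.23, seed=0):
    # Shuffle once, then carve contiguous slices into train/test/validate.
    rng = random.Random(seed)
    paths = paths[:]
    rng.shuffle(paths)
    n_train = int(len(paths) * train)
    n_test = int(len(paths) * test)
    return (paths[:n_train],
            paths[n_train:n_train + n_test],
            paths[n_train + n_test:])

files = [f"face_{i}.jpg" for i in range(428)]   # placeholder file names
tr, te, va = split_paths(files)
print(len(tr), len(te), len(va))  # → 256 98 74
```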
In [82]:
print(train_faces[0], train_targets[0])
print(np.argmax(train_targets[0]))
print(face_names[4])
./images/Train\Mum\20160805-2119.jpg-face-0.jpg [ 0.  0.  0.  0.  1.  0.  0.]
4
Mum
In [44]:
train_tensors = paths_to_tensor(train_faces).astype('float32')/255
test_tensors = paths_to_tensor(test_faces).astype('float32')/255
validate_tensors = paths_to_tensor(validate_faces).astype('float32')/255
print("Train tensor shape.", train_tensors.shape)
print('Test tensor shape.', test_tensors.shape)
print('Validate tensor shape.', validate_tensors.shape)
Train tensor shape. (257, 299, 299, 3)
Test tensor shape. (98, 299, 299, 3)
Validate tensor shape. (73, 299, 299, 3)
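
`paths_to_tensor` produces `(N, 299, 299, 3)` batches, 299×299 being InceptionV3's default input size, and dividing by 255 rescales pixels to [0, 1]. A sketch of the same preprocessing with a synthetic uint8 image standing in for a decoded face crop:

```python
import numpy as np

# Synthetic 299x299 RGB image with uint8 pixels in [0, 255],
# standing in for a face crop decoded from disk.
img = np.random.randint(0, 256, size=(299, 299, 3), dtype=np.uint8)

# Add a batch dimension and rescale to float32 in [0, 1], mirroring
# paths_to_tensor(...).astype('float32')/255 above.
tensor = img[np.newaxis].astype('float32') / 255
print(tensor.shape)  # → (1, 299, 299, 3)
print(tensor.dtype, float(tensor.min()), float(tensor.max()))
```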
In [47]:
x = base_model.output
x = GlobalAveragePooling2D()(x)
x = Dense(1000, activation = 'relu')(x)
x = Dense(200, activation = 'relu')(x)
x = Dense(50, activation = 'relu')(x)
x = Dense(7, activation = 'softmax')(x)

my_model_2 = Model(inputs = base_model.input, outputs = x)
my_model_2.summary()
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to                     
====================================================================================================
input_1 (InputLayer)             (None, None, None, 3) 0                                            
____________________________________________________________________________________________________
conv2d_1 (Conv2D)                (None, None, None, 32 864         input_1[0][0]                    
____________________________________________________________________________________________________
batch_normalization_1 (BatchNorm (None, None, None, 32 96          conv2d_1[0][0]                   
____________________________________________________________________________________________________
activation_1 (Activation)        (None, None, None, 32 0           batch_normalization_1[0][0]      
____________________________________________________________________________________________________
conv2d_2 (Conv2D)                (None, None, None, 32 9216        activation_1[0][0]               
____________________________________________________________________________________________________
batch_normalization_2 (BatchNorm (None, None, None, 32 96          conv2d_2[0][0]                   
____________________________________________________________________________________________________
activation_2 (Activation)        (None, None, None, 32 0           batch_normalization_2[0][0]      
____________________________________________________________________________________________________
conv2d_3 (Conv2D)                (None, None, None, 64 18432       activation_2[0][0]               
____________________________________________________________________________________________________
batch_normalization_3 (BatchNorm (None, None, None, 64 192         conv2d_3[0][0]                   
____________________________________________________________________________________________________
activation_3 (Activation)        (None, None, None, 64 0           batch_normalization_3[0][0]      
____________________________________________________________________________________________________
max_pooling2d_1 (MaxPooling2D)   (None, None, None, 64 0           activation_3[0][0]               
____________________________________________________________________________________________________
conv2d_4 (Conv2D)                (None, None, None, 80 5120        max_pooling2d_1[0][0]            
____________________________________________________________________________________________________
batch_normalization_4 (BatchNorm (None, None, None, 80 240         conv2d_4[0][0]                   
____________________________________________________________________________________________________
activation_4 (Activation)        (None, None, None, 80 0           batch_normalization_4[0][0]      
____________________________________________________________________________________________________
conv2d_5 (Conv2D)                (None, None, None, 19 138240      activation_4[0][0]               
____________________________________________________________________________________________________
batch_normalization_5 (BatchNorm (None, None, None, 19 576         conv2d_5[0][0]                   
____________________________________________________________________________________________________
activation_5 (Activation)        (None, None, None, 19 0           batch_normalization_5[0][0]      
____________________________________________________________________________________________________
max_pooling2d_2 (MaxPooling2D)   (None, None, None, 19 0           activation_5[0][0]               
____________________________________________________________________________________________________
conv2d_9 (Conv2D)                (None, None, None, 64 12288       max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
batch_normalization_9 (BatchNorm (None, None, None, 64 192         conv2d_9[0][0]                   
____________________________________________________________________________________________________
activation_9 (Activation)        (None, None, None, 64 0           batch_normalization_9[0][0]      
____________________________________________________________________________________________________
conv2d_7 (Conv2D)                (None, None, None, 48 9216        max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_10 (Conv2D)               (None, None, None, 96 55296       activation_9[0][0]               
____________________________________________________________________________________________________
batch_normalization_7 (BatchNorm (None, None, None, 48 144         conv2d_7[0][0]                   
____________________________________________________________________________________________________
batch_normalization_10 (BatchNor (None, None, None, 96 288         conv2d_10[0][0]                  
____________________________________________________________________________________________________
activation_7 (Activation)        (None, None, None, 48 0           batch_normalization_7[0][0]      
____________________________________________________________________________________________________
activation_10 (Activation)       (None, None, None, 96 0           batch_normalization_10[0][0]     
____________________________________________________________________________________________________
average_pooling2d_1 (AveragePool (None, None, None, 19 0           max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_6 (Conv2D)                (None, None, None, 64 12288       max_pooling2d_2[0][0]            
____________________________________________________________________________________________________
conv2d_8 (Conv2D)                (None, None, None, 64 76800       activation_7[0][0]               
____________________________________________________________________________________________________
conv2d_11 (Conv2D)               (None, None, None, 96 82944       activation_10[0][0]              
____________________________________________________________________________________________________
conv2d_12 (Conv2D)               (None, None, None, 32 6144        average_pooling2d_1[0][0]        
____________________________________________________________________________________________________
batch_normalization_6 (BatchNorm (None, None, None, 64 192         conv2d_6[0][0]                   
____________________________________________________________________________________________________
batch_normalization_8 (BatchNorm (None, None, None, 64 192         conv2d_8[0][0]                   
____________________________________________________________________________________________________
batch_normalization_11 (BatchNor (None, None, None, 96 288         conv2d_11[0][0]                  
____________________________________________________________________________________________________
batch_normalization_12 (BatchNor (None, None, None, 32 96          conv2d_12[0][0]                  
____________________________________________________________________________________________________
activation_6 (Activation)        (None, None, None, 64 0           batch_normalization_6[0][0]      
____________________________________________________________________________________________________
activation_8 (Activation)        (None, None, None, 64 0           batch_normalization_8[0][0]      
____________________________________________________________________________________________________
activation_11 (Activation)       (None, None, None, 96 0           batch_normalization_11[0][0]     
____________________________________________________________________________________________________
activation_12 (Activation)       (None, None, None, 32 0           batch_normalization_12[0][0]     
____________________________________________________________________________________________________
mixed0 (Concatenate)             (None, None, None, 25 0           activation_6[0][0]               
                                                                   activation_8[0][0]               
                                                                   activation_11[0][0]              
                                                                   activation_12[0][0]              
____________________________________________________________________________________________________
conv2d_16 (Conv2D)               (None, None, None, 64 16384       mixed0[0][0]                     
____________________________________________________________________________________________________
batch_normalization_16 (BatchNor (None, None, None, 64 192         conv2d_16[0][0]                  
____________________________________________________________________________________________________
activation_16 (Activation)       (None, None, None, 64 0           batch_normalization_16[0][0]     
____________________________________________________________________________________________________
conv2d_14 (Conv2D)               (None, None, None, 48 12288       mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_17 (Conv2D)               (None, None, None, 96 55296       activation_16[0][0]              
____________________________________________________________________________________________________
batch_normalization_14 (BatchNor (None, None, None, 48 144         conv2d_14[0][0]                  
____________________________________________________________________________________________________
batch_normalization_17 (BatchNor (None, None, None, 96 288         conv2d_17[0][0]                  
____________________________________________________________________________________________________
activation_14 (Activation)       (None, None, None, 48 0           batch_normalization_14[0][0]     
____________________________________________________________________________________________________
activation_17 (Activation)       (None, None, None, 96 0           batch_normalization_17[0][0]     
____________________________________________________________________________________________________
average_pooling2d_2 (AveragePool (None, None, None, 25 0           mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_13 (Conv2D)               (None, None, None, 64 16384       mixed0[0][0]                     
____________________________________________________________________________________________________
conv2d_15 (Conv2D)               (None, None, None, 64 76800       activation_14[0][0]              
____________________________________________________________________________________________________
conv2d_18 (Conv2D)               (None, None, None, 96 82944       activation_17[0][0]              
____________________________________________________________________________________________________
conv2d_19 (Conv2D)               (None, None, None, 64 16384       average_pooling2d_2[0][0]        
____________________________________________________________________________________________________
batch_normalization_13 (BatchNor (None, None, None, 64 192         conv2d_13[0][0]                  
____________________________________________________________________________________________________
batch_normalization_15 (BatchNor (None, None, None, 64 192         conv2d_15[0][0]                  
____________________________________________________________________________________________________
batch_normalization_18 (BatchNor (None, None, None, 96 288         conv2d_18[0][0]                  
____________________________________________________________________________________________________
batch_normalization_19 (BatchNor (None, None, None, 64 192         conv2d_19[0][0]                  
____________________________________________________________________________________________________
activation_13 (Activation)       (None, None, None, 64 0           batch_normalization_13[0][0]     
____________________________________________________________________________________________________
activation_15 (Activation)       (None, None, None, 64 0           batch_normalization_15[0][0]     
____________________________________________________________________________________________________
activation_18 (Activation)       (None, None, None, 96 0           batch_normalization_18[0][0]     
____________________________________________________________________________________________________
activation_19 (Activation)       (None, None, None, 64 0           batch_normalization_19[0][0]     
____________________________________________________________________________________________________
mixed1 (Concatenate)             (None, None, None, 28 0           activation_13[0][0]              
                                                                   activation_15[0][0]              
                                                                   activation_18[0][0]              
                                                                   activation_19[0][0]              
____________________________________________________________________________________________________
conv2d_23 (Conv2D)               (None, None, None, 64 18432       mixed1[0][0]                     
____________________________________________________________________________________________________
batch_normalization_23 (BatchNor (None, None, None, 64 192         conv2d_23[0][0]                  
____________________________________________________________________________________________________
activation_23 (Activation)       (None, None, None, 64 0           batch_normalization_23[0][0]     
____________________________________________________________________________________________________
conv2d_21 (Conv2D)               (None, None, None, 48 13824       mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_24 (Conv2D)               (None, None, None, 96 55296       activation_23[0][0]              
____________________________________________________________________________________________________
batch_normalization_21 (BatchNor (None, None, None, 48 144         conv2d_21[0][0]                  
____________________________________________________________________________________________________
batch_normalization_24 (BatchNor (None, None, None, 96 288         conv2d_24[0][0]                  
____________________________________________________________________________________________________
activation_21 (Activation)       (None, None, None, 48 0           batch_normalization_21[0][0]     
____________________________________________________________________________________________________
activation_24 (Activation)       (None, None, None, 96 0           batch_normalization_24[0][0]     
____________________________________________________________________________________________________
average_pooling2d_3 (AveragePool (None, None, None, 28 0           mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_20 (Conv2D)               (None, None, None, 64 18432       mixed1[0][0]                     
____________________________________________________________________________________________________
conv2d_22 (Conv2D)               (None, None, None, 64 76800       activation_21[0][0]              
____________________________________________________________________________________________________
conv2d_25 (Conv2D)               (None, None, None, 96 82944       activation_24[0][0]              
____________________________________________________________________________________________________
conv2d_26 (Conv2D)               (None, None, None, 64 18432       average_pooling2d_3[0][0]        
____________________________________________________________________________________________________
batch_normalization_20 (BatchNor (None, None, None, 64 192         conv2d_20[0][0]                  
____________________________________________________________________________________________________
batch_normalization_22 (BatchNor (None, None, None, 64 192         conv2d_22[0][0]                  
____________________________________________________________________________________________________
batch_normalization_25 (BatchNor (None, None, None, 96 288         conv2d_25[0][0]                  
____________________________________________________________________________________________________
batch_normalization_26 (BatchNor (None, None, None, 64 192         conv2d_26[0][0]                  
____________________________________________________________________________________________________
activation_20 (Activation)       (None, None, None, 64 0           batch_normalization_20[0][0]     
____________________________________________________________________________________________________
activation_22 (Activation)       (None, None, None, 64 0           batch_normalization_22[0][0]     
____________________________________________________________________________________________________
activation_25 (Activation)       (None, None, None, 96 0           batch_normalization_25[0][0]     
____________________________________________________________________________________________________
activation_26 (Activation)       (None, None, None, 64 0           batch_normalization_26[0][0]     
____________________________________________________________________________________________________
mixed2 (Concatenate)             (None, None, None, 28 0           activation_20[0][0]              
                                                                   activation_22[0][0]              
                                                                   activation_25[0][0]              
                                                                   activation_26[0][0]              
____________________________________________________________________________________________________
conv2d_28 (Conv2D)               (None, None, None, 64 18432       mixed2[0][0]                     
____________________________________________________________________________________________________
batch_normalization_28 (BatchNor (None, None, None, 64 192         conv2d_28[0][0]                  
____________________________________________________________________________________________________
activation_28 (Activation)       (None, None, None, 64 0           batch_normalization_28[0][0]     
____________________________________________________________________________________________________
conv2d_29 (Conv2D)               (None, None, None, 96 55296       activation_28[0][0]              
____________________________________________________________________________________________________
batch_normalization_29 (BatchNor (None, None, None, 96 288         conv2d_29[0][0]                  
____________________________________________________________________________________________________
activation_29 (Activation)       (None, None, None, 96 0           batch_normalization_29[0][0]     
____________________________________________________________________________________________________
conv2d_27 (Conv2D)               (None, None, None, 38 995328      mixed2[0][0]                     
____________________________________________________________________________________________________
conv2d_30 (Conv2D)               (None, None, None, 96 82944       activation_29[0][0]              
____________________________________________________________________________________________________
batch_normalization_27 (BatchNor (None, None, None, 38 1152        conv2d_27[0][0]                  
____________________________________________________________________________________________________
batch_normalization_30 (BatchNor (None, None, None, 96 288         conv2d_30[0][0]                  
____________________________________________________________________________________________________
activation_27 (Activation)       (None, None, None, 38 0           batch_normalization_27[0][0]     
____________________________________________________________________________________________________
activation_30 (Activation)       (None, None, None, 96 0           batch_normalization_30[0][0]     
____________________________________________________________________________________________________
max_pooling2d_3 (MaxPooling2D)   (None, None, None, 28 0           mixed2[0][0]                     
____________________________________________________________________________________________________
mixed3 (Concatenate)             (None, None, None, 76 0           activation_27[0][0]              
                                                                   activation_30[0][0]              
                                                                   max_pooling2d_3[0][0]            
____________________________________________________________________________________________________
conv2d_35 (Conv2D)               (None, None, None, 12 98304       mixed3[0][0]                     
____________________________________________________________________________________________________
batch_normalization_35 (BatchNor (None, None, None, 12 384         conv2d_35[0][0]                  
____________________________________________________________________________________________________
activation_35 (Activation)       (None, None, None, 12 0           batch_normalization_35[0][0]     
____________________________________________________________________________________________________
conv2d_36 (Conv2D)               (None, None, None, 12 114688      activation_35[0][0]              
____________________________________________________________________________________________________
batch_normalization_36 (BatchNor (None, None, None, 12 384         conv2d_36[0][0]                  
____________________________________________________________________________________________________
activation_36 (Activation)       (None, None, None, 12 0           batch_normalization_36[0][0]     
____________________________________________________________________________________________________
conv2d_32 (Conv2D)               (None, None, None, 12 98304       mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_37 (Conv2D)               (None, None, None, 12 114688      activation_36[0][0]              
____________________________________________________________________________________________________
batch_normalization_32 (BatchNor (None, None, None, 12 384         conv2d_32[0][0]                  
____________________________________________________________________________________________________
batch_normalization_37 (BatchNor (None, None, None, 12 384         conv2d_37[0][0]                  
____________________________________________________________________________________________________
activation_32 (Activation)       (None, None, None, 12 0           batch_normalization_32[0][0]     
____________________________________________________________________________________________________
activation_37 (Activation)       (None, None, None, 12 0           batch_normalization_37[0][0]     
____________________________________________________________________________________________________
conv2d_33 (Conv2D)               (None, None, None, 12 114688      activation_32[0][0]              
____________________________________________________________________________________________________
conv2d_38 (Conv2D)               (None, None, None, 12 114688      activation_37[0][0]              
____________________________________________________________________________________________________
batch_normalization_33 (BatchNor (None, None, None, 12 384         conv2d_33[0][0]                  
____________________________________________________________________________________________________
batch_normalization_38 (BatchNor (None, None, None, 12 384         conv2d_38[0][0]                  
____________________________________________________________________________________________________
activation_33 (Activation)       (None, None, None, 12 0           batch_normalization_33[0][0]     
____________________________________________________________________________________________________
activation_38 (Activation)       (None, None, None, 12 0           batch_normalization_38[0][0]     
____________________________________________________________________________________________________
average_pooling2d_4 (AveragePool (None, None, None, 76 0           mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_31 (Conv2D)               (None, None, None, 19 147456      mixed3[0][0]                     
____________________________________________________________________________________________________
conv2d_34 (Conv2D)               (None, None, None, 19 172032      activation_33[0][0]              
____________________________________________________________________________________________________
conv2d_39 (Conv2D)               (None, None, None, 19 172032      activation_38[0][0]              
____________________________________________________________________________________________________
conv2d_40 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_4[0][0]        
____________________________________________________________________________________________________
batch_normalization_31 (BatchNor (None, None, None, 19 576         conv2d_31[0][0]                  
____________________________________________________________________________________________________
batch_normalization_34 (BatchNor (None, None, None, 19 576         conv2d_34[0][0]                  
____________________________________________________________________________________________________
batch_normalization_39 (BatchNor (None, None, None, 19 576         conv2d_39[0][0]                  
____________________________________________________________________________________________________
batch_normalization_40 (BatchNor (None, None, None, 19 576         conv2d_40[0][0]                  
____________________________________________________________________________________________________
activation_31 (Activation)       (None, None, None, 19 0           batch_normalization_31[0][0]     
____________________________________________________________________________________________________
activation_34 (Activation)       (None, None, None, 19 0           batch_normalization_34[0][0]     
____________________________________________________________________________________________________
activation_39 (Activation)       (None, None, None, 19 0           batch_normalization_39[0][0]     
____________________________________________________________________________________________________
activation_40 (Activation)       (None, None, None, 19 0           batch_normalization_40[0][0]     
____________________________________________________________________________________________________
mixed4 (Concatenate)             (None, None, None, 76 0           activation_31[0][0]              
                                                                   activation_34[0][0]              
                                                                   activation_39[0][0]              
                                                                   activation_40[0][0]              
____________________________________________________________________________________________________
conv2d_45 (Conv2D)               (None, None, None, 16 122880      mixed4[0][0]                     
____________________________________________________________________________________________________
batch_normalization_45 (BatchNor (None, None, None, 16 480         conv2d_45[0][0]                  
____________________________________________________________________________________________________
activation_45 (Activation)       (None, None, None, 16 0           batch_normalization_45[0][0]     
____________________________________________________________________________________________________
conv2d_46 (Conv2D)               (None, None, None, 16 179200      activation_45[0][0]              
____________________________________________________________________________________________________
batch_normalization_46 (BatchNor (None, None, None, 16 480         conv2d_46[0][0]                  
____________________________________________________________________________________________________
activation_46 (Activation)       (None, None, None, 16 0           batch_normalization_46[0][0]     
____________________________________________________________________________________________________
conv2d_42 (Conv2D)               (None, None, None, 16 122880      mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_47 (Conv2D)               (None, None, None, 16 179200      activation_46[0][0]              
____________________________________________________________________________________________________
batch_normalization_42 (BatchNor (None, None, None, 16 480         conv2d_42[0][0]                  
____________________________________________________________________________________________________
batch_normalization_47 (BatchNor (None, None, None, 16 480         conv2d_47[0][0]                  
____________________________________________________________________________________________________
activation_42 (Activation)       (None, None, None, 16 0           batch_normalization_42[0][0]     
____________________________________________________________________________________________________
activation_47 (Activation)       (None, None, None, 16 0           batch_normalization_47[0][0]     
____________________________________________________________________________________________________
conv2d_43 (Conv2D)               (None, None, None, 16 179200      activation_42[0][0]              
____________________________________________________________________________________________________
conv2d_48 (Conv2D)               (None, None, None, 16 179200      activation_47[0][0]              
____________________________________________________________________________________________________
batch_normalization_43 (BatchNor (None, None, None, 16 480         conv2d_43[0][0]                  
____________________________________________________________________________________________________
batch_normalization_48 (BatchNor (None, None, None, 16 480         conv2d_48[0][0]                  
____________________________________________________________________________________________________
activation_43 (Activation)       (None, None, None, 16 0           batch_normalization_43[0][0]     
____________________________________________________________________________________________________
activation_48 (Activation)       (None, None, None, 16 0           batch_normalization_48[0][0]     
____________________________________________________________________________________________________
average_pooling2d_5 (AveragePool (None, None, None, 76 0           mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_41 (Conv2D)               (None, None, None, 19 147456      mixed4[0][0]                     
____________________________________________________________________________________________________
conv2d_44 (Conv2D)               (None, None, None, 19 215040      activation_43[0][0]              
____________________________________________________________________________________________________
conv2d_49 (Conv2D)               (None, None, None, 19 215040      activation_48[0][0]              
____________________________________________________________________________________________________
conv2d_50 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_5[0][0]        
____________________________________________________________________________________________________
batch_normalization_41 (BatchNor (None, None, None, 19 576         conv2d_41[0][0]                  
____________________________________________________________________________________________________
batch_normalization_44 (BatchNor (None, None, None, 19 576         conv2d_44[0][0]                  
____________________________________________________________________________________________________
batch_normalization_49 (BatchNor (None, None, None, 19 576         conv2d_49[0][0]                  
____________________________________________________________________________________________________
batch_normalization_50 (BatchNor (None, None, None, 19 576         conv2d_50[0][0]                  
____________________________________________________________________________________________________
activation_41 (Activation)       (None, None, None, 19 0           batch_normalization_41[0][0]     
____________________________________________________________________________________________________
activation_44 (Activation)       (None, None, None, 19 0           batch_normalization_44[0][0]     
____________________________________________________________________________________________________
activation_49 (Activation)       (None, None, None, 19 0           batch_normalization_49[0][0]     
____________________________________________________________________________________________________
activation_50 (Activation)       (None, None, None, 19 0           batch_normalization_50[0][0]     
____________________________________________________________________________________________________
mixed5 (Concatenate)             (None, None, None, 76 0           activation_41[0][0]              
                                                                   activation_44[0][0]              
                                                                   activation_49[0][0]              
                                                                   activation_50[0][0]              
____________________________________________________________________________________________________
conv2d_55 (Conv2D)               (None, None, None, 16 122880      mixed5[0][0]                     
____________________________________________________________________________________________________
batch_normalization_55 (BatchNor (None, None, None, 16 480         conv2d_55[0][0]                  
____________________________________________________________________________________________________
activation_55 (Activation)       (None, None, None, 16 0           batch_normalization_55[0][0]     
____________________________________________________________________________________________________
conv2d_56 (Conv2D)               (None, None, None, 16 179200      activation_55[0][0]              
____________________________________________________________________________________________________
batch_normalization_56 (BatchNor (None, None, None, 16 480         conv2d_56[0][0]                  
____________________________________________________________________________________________________
activation_56 (Activation)       (None, None, None, 16 0           batch_normalization_56[0][0]     
____________________________________________________________________________________________________
conv2d_52 (Conv2D)               (None, None, None, 16 122880      mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_57 (Conv2D)               (None, None, None, 16 179200      activation_56[0][0]              
____________________________________________________________________________________________________
batch_normalization_52 (BatchNor (None, None, None, 16 480         conv2d_52[0][0]                  
____________________________________________________________________________________________________
batch_normalization_57 (BatchNor (None, None, None, 16 480         conv2d_57[0][0]                  
____________________________________________________________________________________________________
activation_52 (Activation)       (None, None, None, 16 0           batch_normalization_52[0][0]     
____________________________________________________________________________________________________
activation_57 (Activation)       (None, None, None, 16 0           batch_normalization_57[0][0]     
____________________________________________________________________________________________________
conv2d_53 (Conv2D)               (None, None, None, 16 179200      activation_52[0][0]              
____________________________________________________________________________________________________
conv2d_58 (Conv2D)               (None, None, None, 16 179200      activation_57[0][0]              
____________________________________________________________________________________________________
batch_normalization_53 (BatchNor (None, None, None, 16 480         conv2d_53[0][0]                  
____________________________________________________________________________________________________
batch_normalization_58 (BatchNor (None, None, None, 16 480         conv2d_58[0][0]                  
____________________________________________________________________________________________________
activation_53 (Activation)       (None, None, None, 16 0           batch_normalization_53[0][0]     
____________________________________________________________________________________________________
activation_58 (Activation)       (None, None, None, 16 0           batch_normalization_58[0][0]     
____________________________________________________________________________________________________
average_pooling2d_6 (AveragePool (None, None, None, 76 0           mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_51 (Conv2D)               (None, None, None, 19 147456      mixed5[0][0]                     
____________________________________________________________________________________________________
conv2d_54 (Conv2D)               (None, None, None, 19 215040      activation_53[0][0]              
____________________________________________________________________________________________________
conv2d_59 (Conv2D)               (None, None, None, 19 215040      activation_58[0][0]              
____________________________________________________________________________________________________
conv2d_60 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_6[0][0]        
____________________________________________________________________________________________________
batch_normalization_51 (BatchNor (None, None, None, 19 576         conv2d_51[0][0]                  
____________________________________________________________________________________________________
batch_normalization_54 (BatchNor (None, None, None, 19 576         conv2d_54[0][0]                  
____________________________________________________________________________________________________
batch_normalization_59 (BatchNor (None, None, None, 19 576         conv2d_59[0][0]                  
____________________________________________________________________________________________________
batch_normalization_60 (BatchNor (None, None, None, 19 576         conv2d_60[0][0]                  
____________________________________________________________________________________________________
activation_51 (Activation)       (None, None, None, 19 0           batch_normalization_51[0][0]     
____________________________________________________________________________________________________
activation_54 (Activation)       (None, None, None, 19 0           batch_normalization_54[0][0]     
____________________________________________________________________________________________________
activation_59 (Activation)       (None, None, None, 19 0           batch_normalization_59[0][0]     
____________________________________________________________________________________________________
activation_60 (Activation)       (None, None, None, 19 0           batch_normalization_60[0][0]     
____________________________________________________________________________________________________
mixed6 (Concatenate)             (None, None, None, 76 0           activation_51[0][0]              
                                                                   activation_54[0][0]              
                                                                   activation_59[0][0]              
                                                                   activation_60[0][0]              
____________________________________________________________________________________________________
conv2d_65 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
batch_normalization_65 (BatchNor (None, None, None, 19 576         conv2d_65[0][0]                  
____________________________________________________________________________________________________
activation_65 (Activation)       (None, None, None, 19 0           batch_normalization_65[0][0]     
____________________________________________________________________________________________________
conv2d_66 (Conv2D)               (None, None, None, 19 258048      activation_65[0][0]              
____________________________________________________________________________________________________
batch_normalization_66 (BatchNor (None, None, None, 19 576         conv2d_66[0][0]                  
____________________________________________________________________________________________________
activation_66 (Activation)       (None, None, None, 19 0           batch_normalization_66[0][0]     
____________________________________________________________________________________________________
conv2d_62 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_67 (Conv2D)               (None, None, None, 19 258048      activation_66[0][0]              
____________________________________________________________________________________________________
batch_normalization_62 (BatchNor (None, None, None, 19 576         conv2d_62[0][0]                  
____________________________________________________________________________________________________
batch_normalization_67 (BatchNor (None, None, None, 19 576         conv2d_67[0][0]                  
____________________________________________________________________________________________________
activation_62 (Activation)       (None, None, None, 19 0           batch_normalization_62[0][0]     
____________________________________________________________________________________________________
activation_67 (Activation)       (None, None, None, 19 0           batch_normalization_67[0][0]     
____________________________________________________________________________________________________
conv2d_63 (Conv2D)               (None, None, None, 19 258048      activation_62[0][0]              
____________________________________________________________________________________________________
conv2d_68 (Conv2D)               (None, None, None, 19 258048      activation_67[0][0]              
____________________________________________________________________________________________________
batch_normalization_63 (BatchNor (None, None, None, 19 576         conv2d_63[0][0]                  
____________________________________________________________________________________________________
batch_normalization_68 (BatchNor (None, None, None, 19 576         conv2d_68[0][0]                  
____________________________________________________________________________________________________
activation_63 (Activation)       (None, None, None, 19 0           batch_normalization_63[0][0]     
____________________________________________________________________________________________________
activation_68 (Activation)       (None, None, None, 19 0           batch_normalization_68[0][0]     
____________________________________________________________________________________________________
average_pooling2d_7 (AveragePool (None, None, None, 76 0           mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_61 (Conv2D)               (None, None, None, 19 147456      mixed6[0][0]                     
____________________________________________________________________________________________________
conv2d_64 (Conv2D)               (None, None, None, 19 258048      activation_63[0][0]              
____________________________________________________________________________________________________
conv2d_69 (Conv2D)               (None, None, None, 19 258048      activation_68[0][0]              
____________________________________________________________________________________________________
conv2d_70 (Conv2D)               (None, None, None, 19 147456      average_pooling2d_7[0][0]        
____________________________________________________________________________________________________
batch_normalization_61 (BatchNor (None, None, None, 19 576         conv2d_61[0][0]                  
____________________________________________________________________________________________________
batch_normalization_64 (BatchNor (None, None, None, 19 576         conv2d_64[0][0]                  
____________________________________________________________________________________________________
batch_normalization_69 (BatchNor (None, None, None, 19 576         conv2d_69[0][0]                  
____________________________________________________________________________________________________
batch_normalization_70 (BatchNor (None, None, None, 19 576         conv2d_70[0][0]                  
____________________________________________________________________________________________________
activation_61 (Activation)       (None, None, None, 19 0           batch_normalization_61[0][0]     
____________________________________________________________________________________________________
activation_64 (Activation)       (None, None, None, 19 0           batch_normalization_64[0][0]     
____________________________________________________________________________________________________
activation_69 (Activation)       (None, None, None, 19 0           batch_normalization_69[0][0]     
____________________________________________________________________________________________________
activation_70 (Activation)       (None, None, None, 19 0           batch_normalization_70[0][0]     
____________________________________________________________________________________________________
mixed7 (Concatenate)             (None, None, None, 76 0           activation_61[0][0]              
                                                                   activation_64[0][0]              
                                                                   activation_69[0][0]              
                                                                   activation_70[0][0]              
____________________________________________________________________________________________________
conv2d_73 (Conv2D)               (None, None, None, 19 147456      mixed7[0][0]                     
____________________________________________________________________________________________________
batch_normalization_73 (BatchNor (None, None, None, 19 576         conv2d_73[0][0]                  
____________________________________________________________________________________________________
activation_73 (Activation)       (None, None, None, 19 0           batch_normalization_73[0][0]     
____________________________________________________________________________________________________
conv2d_74 (Conv2D)               (None, None, None, 19 258048      activation_73[0][0]              
____________________________________________________________________________________________________
batch_normalization_74 (BatchNor (None, None, None, 19 576         conv2d_74[0][0]                  
____________________________________________________________________________________________________
activation_74 (Activation)       (None, None, None, 19 0           batch_normalization_74[0][0]     
____________________________________________________________________________________________________
conv2d_71 (Conv2D)               (None, None, None, 19 147456      mixed7[0][0]                     
____________________________________________________________________________________________________
conv2d_75 (Conv2D)               (None, None, None, 19 258048      activation_74[0][0]              
____________________________________________________________________________________________________
batch_normalization_71 (BatchNor (None, None, None, 19 576         conv2d_71[0][0]                  
____________________________________________________________________________________________________
batch_normalization_75 (BatchNor (None, None, None, 19 576         conv2d_75[0][0]                  
____________________________________________________________________________________________________
activation_71 (Activation)       (None, None, None, 19 0           batch_normalization_71[0][0]     
____________________________________________________________________________________________________
activation_75 (Activation)       (None, None, None, 19 0           batch_normalization_75[0][0]     
____________________________________________________________________________________________________
conv2d_72 (Conv2D)               (None, None, None, 32 552960      activation_71[0][0]              
____________________________________________________________________________________________________
conv2d_76 (Conv2D)               (None, None, None, 19 331776      activation_75[0][0]              
____________________________________________________________________________________________________
batch_normalization_72 (BatchNor (None, None, None, 32 960         conv2d_72[0][0]                  
____________________________________________________________________________________________________
batch_normalization_76 (BatchNor (None, None, None, 19 576         conv2d_76[0][0]                  
____________________________________________________________________________________________________
activation_72 (Activation)       (None, None, None, 32 0           batch_normalization_72[0][0]     
____________________________________________________________________________________________________
activation_76 (Activation)       (None, None, None, 19 0           batch_normalization_76[0][0]     
____________________________________________________________________________________________________
max_pooling2d_4 (MaxPooling2D)   (None, None, None, 76 0           mixed7[0][0]                     
____________________________________________________________________________________________________
mixed8 (Concatenate)             (None, None, None, 12 0           activation_72[0][0]              
                                                                   activation_76[0][0]              
                                                                   max_pooling2d_4[0][0]            
____________________________________________________________________________________________________
conv2d_81 (Conv2D)               (None, None, None, 44 573440      mixed8[0][0]                     
____________________________________________________________________________________________________
batch_normalization_81 (BatchNor (None, None, None, 44 1344        conv2d_81[0][0]                  
____________________________________________________________________________________________________
activation_81 (Activation)       (None, None, None, 44 0           batch_normalization_81[0][0]     
____________________________________________________________________________________________________
conv2d_78 (Conv2D)               (None, None, None, 38 491520      mixed8[0][0]                     
____________________________________________________________________________________________________
conv2d_82 (Conv2D)               (None, None, None, 38 1548288     activation_81[0][0]              
____________________________________________________________________________________________________
batch_normalization_78 (BatchNor (None, None, None, 38 1152        conv2d_78[0][0]                  
____________________________________________________________________________________________________
batch_normalization_82 (BatchNor (None, None, None, 38 1152        conv2d_82[0][0]                  
____________________________________________________________________________________________________
activation_78 (Activation)       (None, None, None, 38 0           batch_normalization_78[0][0]     
____________________________________________________________________________________________________
activation_82 (Activation)       (None, None, None, 38 0           batch_normalization_82[0][0]     
____________________________________________________________________________________________________
conv2d_79 (Conv2D)               (None, None, None, 38 442368      activation_78[0][0]              
____________________________________________________________________________________________________
conv2d_80 (Conv2D)               (None, None, None, 38 442368      activation_78[0][0]              
____________________________________________________________________________________________________
conv2d_83 (Conv2D)               (None, None, None, 38 442368      activation_82[0][0]              
____________________________________________________________________________________________________
conv2d_84 (Conv2D)               (None, None, None, 38 442368      activation_82[0][0]              
____________________________________________________________________________________________________
average_pooling2d_8 (AveragePool (None, None, None, 12 0           mixed8[0][0]                     
____________________________________________________________________________________________________
conv2d_77 (Conv2D)               (None, None, None, 32 409600      mixed8[0][0]                     
____________________________________________________________________________________________________
batch_normalization_79 (BatchNor (None, None, None, 38 1152        conv2d_79[0][0]                  
____________________________________________________________________________________________________
batch_normalization_80 (BatchNor (None, None, None, 38 1152        conv2d_80[0][0]                  
____________________________________________________________________________________________________
batch_normalization_83 (BatchNor (None, None, None, 38 1152        conv2d_83[0][0]                  
____________________________________________________________________________________________________
batch_normalization_84 (BatchNor (None, None, None, 38 1152        conv2d_84[0][0]                  
____________________________________________________________________________________________________
conv2d_85 (Conv2D)               (None, None, None, 19 245760      average_pooling2d_8[0][0]        
____________________________________________________________________________________________________
batch_normalization_77 (BatchNor (None, None, None, 32 960         conv2d_77[0][0]                  
____________________________________________________________________________________________________
activation_79 (Activation)       (None, None, None, 38 0           batch_normalization_79[0][0]     
____________________________________________________________________________________________________
activation_80 (Activation)       (None, None, None, 38 0           batch_normalization_80[0][0]     
____________________________________________________________________________________________________
activation_83 (Activation)       (None, None, None, 38 0           batch_normalization_83[0][0]     
____________________________________________________________________________________________________
activation_84 (Activation)       (None, None, None, 38 0           batch_normalization_84[0][0]     
____________________________________________________________________________________________________
batch_normalization_85 (BatchNor (None, None, None, 19 576         conv2d_85[0][0]                  
____________________________________________________________________________________________________
activation_77 (Activation)       (None, None, None, 32 0           batch_normalization_77[0][0]     
____________________________________________________________________________________________________
mixed9_0 (Concatenate)           (None, None, None, 76 0           activation_79[0][0]              
                                                                   activation_80[0][0]              
____________________________________________________________________________________________________
concatenate_1 (Concatenate)      (None, None, None, 76 0           activation_83[0][0]              
                                                                   activation_84[0][0]              
____________________________________________________________________________________________________
activation_85 (Activation)       (None, None, None, 19 0           batch_normalization_85[0][0]     
____________________________________________________________________________________________________
mixed9 (Concatenate)             (None, None, None, 20 0           activation_77[0][0]              
                                                                   mixed9_0[0][0]                   
                                                                   concatenate_1[0][0]              
                                                                   activation_85[0][0]              
____________________________________________________________________________________________________
conv2d_90 (Conv2D)               (None, None, None, 44 917504      mixed9[0][0]                     
____________________________________________________________________________________________________
batch_normalization_90 (BatchNor (None, None, None, 44 1344        conv2d_90[0][0]                  
____________________________________________________________________________________________________
activation_90 (Activation)       (None, None, None, 44 0           batch_normalization_90[0][0]     
____________________________________________________________________________________________________
conv2d_87 (Conv2D)               (None, None, None, 38 786432      mixed9[0][0]                     
____________________________________________________________________________________________________
conv2d_91 (Conv2D)               (None, None, None, 38 1548288     activation_90[0][0]              
____________________________________________________________________________________________________
batch_normalization_87 (BatchNor (None, None, None, 38 1152        conv2d_87[0][0]                  
____________________________________________________________________________________________________
batch_normalization_91 (BatchNor (None, None, None, 38 1152        conv2d_91[0][0]                  
____________________________________________________________________________________________________
activation_87 (Activation)       (None, None, None, 38 0           batch_normalization_87[0][0]     
____________________________________________________________________________________________________
activation_91 (Activation)       (None, None, None, 38 0           batch_normalization_91[0][0]     
____________________________________________________________________________________________________
conv2d_88 (Conv2D)               (None, None, None, 38 442368      activation_87[0][0]              
____________________________________________________________________________________________________
conv2d_89 (Conv2D)               (None, None, None, 38 442368      activation_87[0][0]              
____________________________________________________________________________________________________
conv2d_92 (Conv2D)               (None, None, None, 38 442368      activation_91[0][0]              
____________________________________________________________________________________________________
conv2d_93 (Conv2D)               (None, None, None, 38 442368      activation_91[0][0]              
____________________________________________________________________________________________________
average_pooling2d_9 (AveragePool (None, None, None, 20 0           mixed9[0][0]                     
____________________________________________________________________________________________________
conv2d_86 (Conv2D)               (None, None, None, 32 655360      mixed9[0][0]                     
____________________________________________________________________________________________________
batch_normalization_88 (BatchNor (None, None, None, 38 1152        conv2d_88[0][0]                  
____________________________________________________________________________________________________
batch_normalization_89 (BatchNor (None, None, None, 38 1152        conv2d_89[0][0]                  
____________________________________________________________________________________________________
batch_normalization_92 (BatchNor (None, None, None, 38 1152        conv2d_92[0][0]                  
____________________________________________________________________________________________________
batch_normalization_93 (BatchNor (None, None, None, 38 1152        conv2d_93[0][0]                  
____________________________________________________________________________________________________
conv2d_94 (Conv2D)               (None, None, None, 19 393216      average_pooling2d_9[0][0]        
____________________________________________________________________________________________________
batch_normalization_86 (BatchNor (None, None, None, 32 960         conv2d_86[0][0]                  
____________________________________________________________________________________________________
activation_88 (Activation)       (None, None, None, 38 0           batch_normalization_88[0][0]     
____________________________________________________________________________________________________
activation_89 (Activation)       (None, None, None, 38 0           batch_normalization_89[0][0]     
____________________________________________________________________________________________________
activation_92 (Activation)       (None, None, None, 38 0           batch_normalization_92[0][0]     
____________________________________________________________________________________________________
activation_93 (Activation)       (None, None, None, 38 0           batch_normalization_93[0][0]     
____________________________________________________________________________________________________
batch_normalization_94 (BatchNor (None, None, None, 19 576         conv2d_94[0][0]                  
____________________________________________________________________________________________________
activation_86 (Activation)       (None, None, None, 32 0           batch_normalization_86[0][0]     
____________________________________________________________________________________________________
mixed9_1 (Concatenate)           (None, None, None, 76 0           activation_88[0][0]              
                                                                   activation_89[0][0]              
____________________________________________________________________________________________________
concatenate_2 (Concatenate)      (None, None, None, 76 0           activation_92[0][0]              
                                                                   activation_93[0][0]              
____________________________________________________________________________________________________
activation_94 (Activation)       (None, None, None, 19 0           batch_normalization_94[0][0]     
____________________________________________________________________________________________________
mixed10 (Concatenate)            (None, None, None, 20 0           activation_86[0][0]              
                                                                   mixed9_1[0][0]                   
                                                                   concatenate_2[0][0]              
                                                                   activation_94[0][0]              
____________________________________________________________________________________________________
global_average_pooling2d_3 (Glob (None, 2048)          0           mixed10[0][0]                    
____________________________________________________________________________________________________
dense_9 (Dense)                  (None, 1000)          2049000     global_average_pooling2d_3[0][0] 
____________________________________________________________________________________________________
dense_10 (Dense)                 (None, 200)           200200      dense_9[0][0]                    
____________________________________________________________________________________________________
dense_11 (Dense)                 (None, 50)            10050       dense_10[0][0]                   
____________________________________________________________________________________________________
dense_12 (Dense)                 (None, 7)             357         dense_11[0][0]                   
====================================================================================================
Total params: 24,062,391
Trainable params: 2,259,607
Non-trainable params: 21,802,784
____________________________________________________________________________________________________
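Keras's default line width clips the output-shape and name columns above, but the Dense rows at the bottom can still be sanity-checked by hand: a fully connected layer has `inputs × units` weights plus `units` biases. A quick check in plain Python (the `dense_params` helper is mine, for illustration only) against the listed parameter counts:

```python
def dense_params(n_in, n_out):
    """Parameter count of a Dense layer: weights (n_in * n_out) plus biases (n_out)."""
    return n_in * n_out + n_out

# The classification head after global average pooling (2048 features), per the summary:
assert dense_params(2048, 1000) == 2049000  # dense_9
assert dense_params(1000, 200) == 200200    # dense_10
assert dense_params(200, 50) == 10050       # dense_11
assert dense_params(50, 7) == 357           # dense_12 (7 output classes)
print('all dense parameter counts match the summary')
```

The 357-parameter output layer confirms the model classifies 7 people; the 21.8M non-trainable parameters are the frozen convolutional base.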
In [48]:
my_model_2.compile(optimizer='rmsprop', loss='categorical_crossentropy', metrics=['accuracy'])
checkpointer = ModelCheckpoint(filepath='my_model_2.h5',
                               verbose=1, save_best_only=True)
early_stopping = EarlyStopping(monitor='val_loss', min_delta=0, patience=30, verbose=1, mode='auto')
reduce_lr = ReduceLROnPlateau(monitor='val_loss', factor=0.9, patience=10, cooldown=0, min_lr=0.00001)
lr_print = LambdaCallback(on_epoch_begin=lambda epoch, logs: print('lr:', K.eval(my_model_2.optimizer.lr)))
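The effect of `ReduceLROnPlateau` shows up directly in the training log below: whenever `val_loss` stalls for `patience=10` epochs, the learning rate is multiplied by `factor=0.9`, floored at `min_lr` (0.001 → 0.0009 → 0.00081). A minimal sketch of that schedule in plain Python (the `decayed_lr` helper is hypothetical, for illustration only):

```python
def decayed_lr(initial_lr, factor, min_lr, n_reductions):
    """Learning rate after n plateau-triggered reductions."""
    lr = initial_lr
    for _ in range(n_reductions):
        lr = max(lr * factor, min_lr)  # never decay below min_lr
    return lr

# Mirrors the values printed by lr_print in the log: 0.001 -> 0.0009 -> 0.00081
for n in range(3):
    print(round(decayed_lr(0.001, 0.9, 0.00001, n), 6))
```

With `factor=0.9` the decay is gentle; roughly 44 reductions would be needed before hitting the `min_lr` floor of 1e-5, so in practice `EarlyStopping` (patience 30) ends training first.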
In [49]:
hist_2 = my_model_2.fit(train_tensors, train_targets,
            validation_data=(validate_tensors, validate_targets),
            epochs=200, verbose=1, batch_size=20,
            callbacks=[checkpointer, early_stopping, reduce_lr, lr_print])
Train on 257 samples, validate on 73 samples
lr: 0.001
Epoch 1/200
240/257 [===========================>..] - ETA: 0s - loss: 2.9585 - acc: 0.1375Epoch 00000: val_loss improved from inf to 1.97502, saving model to my_model.h5
257/257 [==============================] - 5s - loss: 2.9185 - acc: 0.1323 - val_loss: 1.9750 - val_acc: 0.2740
lr: 0.001
Epoch 2/200
240/257 [===========================>..] - ETA: 0s - loss: 1.9198 - acc: 0.2833Epoch 00001: val_loss improved from 1.97502 to 1.74916, saving model to my_model.h5
257/257 [==============================] - 4s - loss: 1.9081 - acc: 0.2879 - val_loss: 1.7492 - val_acc: 0.4110
lr: 0.001
Epoch 3/200
240/257 [===========================>..] - ETA: 0s - loss: 1.6956 - acc: 0.3417Epoch 00002: val_loss did not improve
257/257 [==============================] - 2s - loss: 1.6857 - acc: 0.3346 - val_loss: 2.0031 - val_acc: 0.1918
lr: 0.001
Epoch 4/200
240/257 [===========================>..] - ETA: 0s - loss: 1.6955 - acc: 0.3500Epoch 00003: val_loss did not improve
257/257 [==============================] - 2s - loss: 1.6810 - acc: 0.3424 - val_loss: 1.8424 - val_acc: 0.3151
lr: 0.001
Epoch 5/200
240/257 [===========================>..] - ETA: 0s - loss: 1.5297 - acc: 0.3958Epoch 00004: val_loss improved from 1.74916 to 1.72234, saving model to my_model.h5
257/257 [==============================] - 4s - loss: 1.5162 - acc: 0.4047 - val_loss: 1.7223 - val_acc: 0.2329
lr: 0.001
Epoch 6/200
240/257 [===========================>..] - ETA: 0s - loss: 1.3009 - acc: 0.4958Epoch 00005: val_loss improved from 1.72234 to 1.39820, saving model to my_model.h5
257/257 [==============================] - 5s - loss: 1.2774 - acc: 0.4981 - val_loss: 1.3982 - val_acc: 0.3562
lr: 0.001
Epoch 7/200
240/257 [===========================>..] - ETA: 0s - loss: 1.0890 - acc: 0.5458Epoch 00006: val_loss did not improve
257/257 [==============================] - 2s - loss: 1.1237 - acc: 0.5370 - val_loss: 1.7943 - val_acc: 0.3151
lr: 0.001
Epoch 8/200
240/257 [===========================>..] - ETA: 0s - loss: 1.0804 - acc: 0.5167Epoch 00007: val_loss improved from 1.39820 to 0.86087, saving model to my_model.h5
257/257 [==============================] - 3s - loss: 1.0819 - acc: 0.5136 - val_loss: 0.8609 - val_acc: 0.6575
lr: 0.001
Epoch 9/200
240/257 [===========================>..] - ETA: 0s - loss: 0.9990 - acc: 0.6083Epoch 00008: val_loss did not improve
257/257 [==============================] - 2s - loss: 1.0026 - acc: 0.6070 - val_loss: 0.9568 - val_acc: 0.5753
lr: 0.001
Epoch 10/200
240/257 [===========================>..] - ETA: 0s - loss: 0.8122 - acc: 0.6583Epoch 00009: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.8672 - acc: 0.6420 - val_loss: 1.0624 - val_acc: 0.5342
lr: 0.001
Epoch 11/200
240/257 [===========================>..] - ETA: 0s - loss: 0.5883 - acc: 0.7792Epoch 00010: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.6004 - acc: 0.7743 - val_loss: 0.8974 - val_acc: 0.7123
lr: 0.001
Epoch 12/200
240/257 [===========================>..] - ETA: 0s - loss: 0.7657 - acc: 0.7292Epoch 00011: val_loss improved from 0.86087 to 0.55924, saving model to my_model.h5
257/257 [==============================] - 4s - loss: 0.7343 - acc: 0.7393 - val_loss: 0.5592 - val_acc: 0.8356
lr: 0.001
Epoch 13/200
240/257 [===========================>..] - ETA: 0s - loss: 0.6127 - acc: 0.7292Epoch 00012: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.5896 - acc: 0.7432 - val_loss: 0.7347 - val_acc: 0.7123
lr: 0.001
Epoch 14/200
240/257 [===========================>..] - ETA: 0s - loss: 0.8725 - acc: 0.7375Epoch 00013: val_loss improved from 0.55924 to 0.54739, saving model to my_model.h5
257/257 [==============================] - 3s - loss: 0.8594 - acc: 0.7393 - val_loss: 0.5474 - val_acc: 0.8219
lr: 0.001
Epoch 15/200
240/257 [===========================>..] - ETA: 0s - loss: 0.5763 - acc: 0.7750Epoch 00014: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.5851 - acc: 0.7704 - val_loss: 0.8787 - val_acc: 0.6438
lr: 0.001
Epoch 16/200
240/257 [===========================>..] - ETA: 0s - loss: 0.6774 - acc: 0.7417Epoch 00015: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.6451 - acc: 0.7549 - val_loss: 0.6472 - val_acc: 0.7260
lr: 0.001
Epoch 17/200
240/257 [===========================>..] - ETA: 0s - loss: 0.2294 - acc: 0.9083Epoch 00016: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2270 - acc: 0.9105 - val_loss: 1.0457 - val_acc: 0.5616
lr: 0.001
Epoch 18/200
240/257 [===========================>..] - ETA: 0s - loss: 0.3838 - acc: 0.8458Epoch 00017: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.3878 - acc: 0.8405 - val_loss: 1.2631 - val_acc: 0.6849
lr: 0.001
Epoch 19/200
240/257 [===========================>..] - ETA: 0s - loss: 0.4828 - acc: 0.8208Epoch 00018: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.5108 - acc: 0.8171 - val_loss: 1.1792 - val_acc: 0.6027
lr: 0.001
Epoch 20/200
240/257 [===========================>..] - ETA: 0s - loss: 0.4756 - acc: 0.8292Epoch 00019: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.5270 - acc: 0.8210 - val_loss: 1.8096 - val_acc: 0.4521
lr: 0.001
Epoch 21/200
240/257 [===========================>..] - ETA: 0s - loss: 0.5288 - acc: 0.8167Epoch 00020: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.5071 - acc: 0.8249 - val_loss: 0.6295 - val_acc: 0.7808
lr: 0.001
Epoch 22/200
240/257 [===========================>..] - ETA: 0s - loss: 0.2253 - acc: 0.9458Epoch 00021: val_loss improved from 0.54739 to 0.48412, saving model to my_model.h5
257/257 [==============================] - 4s - loss: 0.2151 - acc: 0.9494 - val_loss: 0.4841 - val_acc: 0.8493
lr: 0.001
Epoch 23/200
240/257 [===========================>..] - ETA: 0s - loss: 0.3358 - acc: 0.8583Epoch 00022: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.3316 - acc: 0.8599 - val_loss: 0.7439 - val_acc: 0.7123
lr: 0.001
Epoch 24/200
240/257 [===========================>..] - ETA: 0s - loss: 0.4791 - acc: 0.8583Epoch 00023: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.4879 - acc: 0.8482 - val_loss: 0.8976 - val_acc: 0.6849
lr: 0.001
Epoch 25/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1387 - acc: 0.9625Epoch 00024: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1731 - acc: 0.9494 - val_loss: 0.7415 - val_acc: 0.7260
lr: 0.001
Epoch 26/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1552 - acc: 0.9250Epoch 00025: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1565 - acc: 0.9261 - val_loss: 1.2601 - val_acc: 0.6849
lr: 0.001
Epoch 27/200
240/257 [===========================>..] - ETA: 0s - loss: 0.4087 - acc: 0.9000Epoch 00026: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.4248 - acc: 0.8872 - val_loss: 0.8292 - val_acc: 0.7534
lr: 0.001
Epoch 28/200
240/257 [===========================>..] - ETA: 0s - loss: 0.4408 - acc: 0.8667Epoch 00027: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.4153 - acc: 0.8755 - val_loss: 0.4934 - val_acc: 0.8630
lr: 0.001
Epoch 29/200
240/257 [===========================>..] - ETA: 0s - loss: 0.2097 - acc: 0.9208Epoch 00028: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2038 - acc: 0.9222 - val_loss: 0.5410 - val_acc: 0.8630
lr: 0.001
Epoch 30/200
240/257 [===========================>..] - ETA: 0s - loss: 0.2540 - acc: 0.9417Epoch 00029: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2455 - acc: 0.9455 - val_loss: 0.8358 - val_acc: 0.7945
lr: 0.001
Epoch 31/200
240/257 [===========================>..] - ETA: 0s - loss: 0.5147 - acc: 0.8375Epoch 00030: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.4893 - acc: 0.8482 - val_loss: 0.5281 - val_acc: 0.8356
lr: 0.001
Epoch 32/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1216 - acc: 0.9542Epoch 00031: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1247 - acc: 0.9494 - val_loss: 0.5002 - val_acc: 0.8219
lr: 0.001
Epoch 33/200
240/257 [===========================>..] - ETA: 0s - loss: 0.3019 - acc: 0.8833Epoch 00032: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2946 - acc: 0.8872 - val_loss: 1.1694 - val_acc: 0.7534
lr: 0.0009
Epoch 34/200
240/257 [===========================>..] - ETA: 0s - loss: 0.3009 - acc: 0.9125Epoch 00033: val_loss improved from 0.48412 to 0.46785, saving model to my_model.h5
257/257 [==============================] - 4s - loss: 0.2844 - acc: 0.9183 - val_loss: 0.4678 - val_acc: 0.8630
lr: 0.0009
Epoch 35/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0751 - acc: 0.9750Epoch 00034: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0719 - acc: 0.9767 - val_loss: 0.4713 - val_acc: 0.8767
lr: 0.0009
Epoch 36/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1233 - acc: 0.9583Epoch 00035: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1241 - acc: 0.9572 - val_loss: 2.0185 - val_acc: 0.6164
lr: 0.0009
Epoch 37/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1066 - acc: 0.9708Epoch 00036: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1010 - acc: 0.9728 - val_loss: 0.5756 - val_acc: 0.8219
lr: 0.0009
Epoch 38/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0980 - acc: 0.9708Epoch 00037: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0956 - acc: 0.9728 - val_loss: 0.7598 - val_acc: 0.8082
lr: 0.0009
Epoch 39/200
240/257 [===========================>..] - ETA: 0s - loss: 0.3566 - acc: 0.9125Epoch 00038: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.3396 - acc: 0.9144 - val_loss: 0.9270 - val_acc: 0.7397
lr: 0.0009
Epoch 40/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0675 - acc: 0.9792Epoch 00039: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0639 - acc: 0.9805 - val_loss: 0.4866 - val_acc: 0.8630
lr: 0.0009
Epoch 41/200
240/257 [===========================>..] - ETA: 0s - loss: 0.4320 - acc: 0.9167Epoch 00040: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.4041 - acc: 0.9222 - val_loss: 0.6852 - val_acc: 0.8356
lr: 0.0009
Epoch 42/200
240/257 [===========================>..] - ETA: 0s - loss: 0.2490 - acc: 0.9500Epoch 00041: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2339 - acc: 0.9533 - val_loss: 0.5326 - val_acc: 0.8630
lr: 0.0009
Epoch 43/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0112 - acc: 1.0000Epoch 00042: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0215 - acc: 0.9922 - val_loss: 1.1828 - val_acc: 0.7671
lr: 0.0009
Epoch 44/200
240/257 [===========================>..] - ETA: 0s - loss: 0.2867 - acc: 0.9167Epoch 00043: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2698 - acc: 0.9222 - val_loss: 0.5117 - val_acc: 0.8356
lr: 0.0009
Epoch 45/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0573 - acc: 0.9875Epoch 00044: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0537 - acc: 0.9883 - val_loss: 0.5207 - val_acc: 0.8630
lr: 0.00081
Epoch 46/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0511 - acc: 0.9750Epoch 00045: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0484 - acc: 0.9767 - val_loss: 0.7573 - val_acc: 0.8356
lr: 0.00081
Epoch 47/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1364 - acc: 0.9583Epoch 00046: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1278 - acc: 0.9611 - val_loss: 0.5719 - val_acc: 0.8356
lr: 0.00081
Epoch 48/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1301 - acc: 0.9500Epoch 00047: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1230 - acc: 0.9533 - val_loss: 0.6137 - val_acc: 0.8630
lr: 0.00081
Epoch 49/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0947 - acc: 0.9667Epoch 00048: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0896 - acc: 0.9689 - val_loss: 0.5815 - val_acc: 0.8767
lr: 0.00081
Epoch 50/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0676 - acc: 0.9792Epoch 00049: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.3419 - acc: 0.9377 - val_loss: 1.1918 - val_acc: 0.7397
lr: 0.00081
Epoch 51/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1527 - acc: 0.9417Epoch 00050: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1434 - acc: 0.9455 - val_loss: 0.5278 - val_acc: 0.8904
lr: 0.00081
Epoch 52/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0062 - acc: 1.0000Epoch 00051: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0075 - acc: 1.0000 - val_loss: 0.7006 - val_acc: 0.8082
lr: 0.00081
Epoch 53/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0070 - acc: 1.0000Epoch 00052: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0077 - acc: 1.0000 - val_loss: 1.0008 - val_acc: 0.8082
lr: 0.00081
Epoch 54/200
240/257 [===========================>..] - ETA: 0s - loss: 0.3842 - acc: 0.9042Epoch 00053: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.3589 - acc: 0.9105 - val_loss: 0.6370 - val_acc: 0.8630
lr: 0.00081
Epoch 55/200
240/257 [===========================>..] - ETA: 0s - loss: 0.4217 - acc: 0.8917Epoch 00054: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.3989 - acc: 0.8988 - val_loss: 0.6382 - val_acc: 0.8493
lr: 0.000729
Epoch 56/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0287 - acc: 0.9875Epoch 00055: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0281 - acc: 0.9883 - val_loss: 0.6168 - val_acc: 0.8493
lr: 0.000729
Epoch 57/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1232 - acc: 0.9500Epoch 00056: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1169 - acc: 0.9533 - val_loss: 0.7441 - val_acc: 0.8630
lr: 0.000729
Epoch 58/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0205 - acc: 0.9958Epoch 00057: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0223 - acc: 0.9922 - val_loss: 0.8706 - val_acc: 0.8493
lr: 0.000729
Epoch 59/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1413 - acc: 0.9583Epoch 00058: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1323 - acc: 0.9611 - val_loss: 0.6593 - val_acc: 0.8767
lr: 0.000729
Epoch 60/200
240/257 [===========================>..] - ETA: 0s - loss: 0.1392 - acc: 0.9542Epoch 00059: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.1583 - acc: 0.9494 - val_loss: 0.8336 - val_acc: 0.8082
lr: 0.000729
Epoch 61/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0112 - acc: 0.9958Epoch 00060: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0106 - acc: 0.9961 - val_loss: 0.6927 - val_acc: 0.8767
lr: 0.000729
Epoch 62/200
240/257 [===========================>..] - ETA: 0s - loss: 0.2693 - acc: 0.9625Epoch 00061: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2519 - acc: 0.9650 - val_loss: 0.6668 - val_acc: 0.8493
lr: 0.000729
Epoch 63/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0131 - acc: 0.9958Epoch 00062: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0123 - acc: 0.9961 - val_loss: 0.7234 - val_acc: 0.8904
lr: 0.000729
Epoch 64/200
240/257 [===========================>..] - ETA: 0s - loss: 0.0045 - acc: 1.0000Epoch 00063: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.0126 - acc: 0.9961 - val_loss: 2.7844 - val_acc: 0.5753
lr: 0.000729
Epoch 65/200
240/257 [===========================>..] - ETA: 0s - loss: 0.3014 - acc: 0.9292Epoch 00064: val_loss did not improve
257/257 [==============================] - 2s - loss: 0.2814 - acc: 0.9339 - val_loss: 0.7638 - val_acc: 0.8904
Epoch 00064: early stopping
In [50]:
plt_hist(hist_2)
dict_keys(['loss', 'val_loss', 'val_acc', 'acc', 'lr'])
In [52]:
my_model_2 = load_model('./my_model_2.h5')
In [53]:
score_2 = my_model_2.evaluate(test_tensors, test_targets, verbose=1)
print('Test loss:', score_2[0])
print('Test accuracy:', score_2[1])
96/98 [============================>.] - ETA: 0sTest loss: 0.32326669839
Test accuracy: 0.887755102041
In [54]:
score_1 = my_model_1.evaluate(test_tensors, test_targets, verbose=1)
print('Test loss:', score_1[0])
print('Test accuracy:', score_1[1])
96/98 [============================>.] - ETA: 0sTest loss: 0.547816322774
Test accuracy: 0.816326530612

More data does help: the second model scores roughly 7 percentage points higher in accuracy (88.8% vs. 81.6%) on the same test set.
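As a quick sanity check, the gap can be recomputed directly from the two test accuracies printed above (the numbers are copied from the `evaluate()` outputs):

```python
# Test accuracies reported by my_model_1.evaluate() and my_model_2.evaluate()
acc_model_1 = 0.816326530612
acc_model_2 = 0.887755102041

# Difference in percentage points between the two models
gap = acc_model_2 - acc_model_1
print(f'Accuracy gap: {gap:.1%}')
```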

Part 4 - Face recognition on sample photos

There are two family photos, 'test_1.jpg' and 'test_2.jpg', in the root of the working directory that were never used for training or testing in the earlier sections. Let's perform face detection on them using show_faces().

In [76]:
show_faces('./test_1.jpg', True, False, 1.3, 7)
Image path ./test_1.jpg
(4912, 7360, 3)
Number of faces detected: 7
In [77]:
show_faces('./test_2.jpg', True, False, 1.35, 5)
Image path ./test_2.jpg
(4766, 7141, 3)
Number of faces detected: 4

Here is one test sample: the model predicts a face and returns the result as an index into face_names.

In [147]:
one_test = test_tensors[56].reshape(-1, 299, 299, 3)
predict_idx = np.argmax(my_model_2.predict(one_test))
print(test_faces[56])
print(face_names)
print('The predicted face is: ', face_names[predict_idx])
./images/Test\Dad\20110204-1825.jpg-face-0.jpg
['Brother', 'Dad', 'Daughter', 'Me', 'Mum', 'Son', 'Wife']
The predicted face is:  Dad
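The same argmax-to-name mapping generalizes to a whole batch. A minimal sketch, using a hypothetical `predictions` array standing in for the `(n_samples, n_classes)` softmax output of `my_model_2.predict()`:

```python
import numpy as np

# Class names in the same order the model was trained on
face_names = ['Brother', 'Dad', 'Daughter', 'Me', 'Mum', 'Son', 'Wife']

# Stand-in for model.predict(batch): one softmax row per sample
predictions = np.array([[0.01, 0.90, 0.02, 0.02, 0.02, 0.02, 0.01],
                        [0.05, 0.05, 0.05, 0.70, 0.05, 0.05, 0.05]])

# argmax over the class axis gives one index per sample
predicted = [face_names[i] for i in np.argmax(predictions, axis=1)]
print(predicted)  # ['Dad', 'Me']
```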

Now let's build the customized recognize_face() and detect_faces() functions.

In [158]:
def recognize_face(image_face):
    # Convert the 3-channel RGB image to a 4-d tensor and normalize it
    image_face = image_face.reshape(-1, 299, 299, 3)/255
    predict_idx = np.argmax(my_model_2.predict(image_face))
    return face_names[predict_idx]
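The reshape-and-normalize step above turns a single face crop into a batch of one, which is the input shape the model expects. A quick illustration with a random stand-in for a cropped face:

```python
import numpy as np

# A single (299, 299, 3) uint8 crop, as produced by cv2.resize() above
face = np.random.randint(0, 256, size=(299, 299, 3), dtype=np.uint8)

# reshape(-1, ...) adds the batch dimension; /255 scales pixels to [0, 1]
batch = face.reshape(-1, 299, 299, 3) / 255
print(batch.shape)  # (1, 299, 299, 3)
```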
In [213]:
def detect_faces(file_path, display=False, recognize=False, scaleFactor=1.3, minNeighb=5):
    print('Image path', file_path)
    
    # The file path contains unicode characters, cannot use cv2.imread() directly
    file_stream = open(file_path, 'rb')
    bytes_arr = bytearray(file_stream.read())
    numpy_ar = np.asarray(bytes_arr, dtype=np.uint8)
    image = cv2.imdecode(numpy_ar, cv2.IMREAD_UNCHANGED)
    print(image.shape)
    
    # Convert to RGB
    image = cv2.cvtColor(image, cv2.COLOR_BGR2RGB)
    # Convert the RGB image to grayscale
    gray = cv2.cvtColor(image, cv2.COLOR_RGB2GRAY)

    # Extract the pre-trained face detector from an xml file
    face_cascade = cv2.CascadeClassifier('detector_architectures/haarcascade_frontalface_default.xml')

    # Detect the faces in image
    faces = face_cascade.detectMultiScale(gray, scaleFactor, minNeighb)

    # Print the number of faces detected in the image
    print('Number of faces detected:', len(faces))

    # Make a copy of the original image to draw face detections on
    image_with_detections = np.copy(image)

    # The list of detected faces
    image_faces = []
    # Get the bounding box for each detected face
    for (x,y,w,h) in faces:
        # Add a red bounding box to the detections image
        if w > 200:
            line_width = w//20
        else:
            line_width = 3
        cur_face = image[y:(y+h), x:(x+w)]
        image_faces.append(cur_face)
        cv2.rectangle(image_with_detections, (x,y), (x+w,y+h), (255,0,0), line_width)
        if recognize:
            cur_face = cv2.resize(cur_face, (299,299))
            name = recognize_face(cur_face)
            
            print('x,y,w,h', x,y,w,h)
            print(name)
            
            cv2.putText(image_with_detections, name,
                        (x,y-100),cv2.FONT_HERSHEY_SIMPLEX, 7, (255,0,0),20,cv2.LINE_AA)

    if display:
        # Display the image with the detections
        fig = plt.figure(figsize=(15, 15))
        ax = fig.add_subplot(1, 1, 1, xticks=[], yticks=[])
        ax.set_title('Test Image')
        ax.imshow(image_with_detections)
    os.chdir(cur_dir)
In [214]:
detect_faces('./test_2.jpg', True, True, 1.35, 5)
Image path ./test_2.jpg
(4766, 7141, 3)
Number of faces detected: 4
x,y,w,h 1997 1163 658 658
Mum
x,y,w,h 2798 1090 546 546
Daughter
x,y,w,h 3805 1292 614 614
Son
x,y,w,h 4285 857 765 765
Dad
In [212]:
detect_faces('./test_1.jpg', True, True, 1.3, 7)
Image path ./test_1.jpg
(4912, 7360, 3)
Number of faces detected: 7
x,y,w,h 2080 1051 459 459
Mum
x,y,w,h 3923 750 477 477
Brother
x,y,w,h 2961 2047 430 430
Wife
x,y,w,h 1693 1509 550 550
Dad
x,y,w,h 2166 2300 427 427
Son
x,y,w,h 5083 1313 518 518
Brother
x,y,w,h 3392 2045 302 302
Daughter

And now I finally have 'theoretical' evidence of how much I look like my brother. Despite us not being twins, many friends say we look 'exactly' like twins!